Kappa statistic

Interpretation

Landis and Koch[1] proposed the schema in the table below for interpreting κ values.

Proposed interpretation of κ values

    κ            Interpretation
    < 0          Poor agreement
    0.00–0.20    Slight agreement
    0.21–0.40    Fair agreement
    0.41–0.60    Moderate agreement
    0.61–0.80    Substantial agreement
    0.81–1.00    Almost perfect agreement
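
The cutoffs in this table are straightforward to apply in software. The sketch below is a minimal illustration in Python; the helper names cohens_kappa and landis_koch_label are illustrative choices for this example, not part of any established library. It computes Cohen's κ for two raters from first principles and maps the result to the Landis and Koch labels above.

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: (p_o - p_e) / (1 - p_e), where p_o is the observed
    agreement and p_e is the agreement expected by chance."""
    n = len(rater_a)
    categories = set(rater_a) | set(rater_b)
    # Observed agreement: fraction of cases where the raters match.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: product of each rater's marginal rate per category.
    p_e = sum((rater_a.count(c) / n) * (rater_b.count(c) / n)
              for c in categories)
    return (p_o - p_e) / (1 - p_e)

def landis_koch_label(kappa):
    """Map a kappa value to the Landis and Koch (1977) interpretation."""
    if kappa < 0:
        return "Poor agreement"
    for upper, label in [(0.20, "Slight agreement"),
                         (0.40, "Fair agreement"),
                         (0.60, "Moderate agreement"),
                         (0.80, "Substantial agreement")]:
        if kappa <= upper:
            return label
    return "Almost perfect agreement"

# Example: two raters classify eight cases as "yes" or "no".
a = ["yes", "yes", "no", "yes", "no", "no", "yes", "no"]
b = ["yes", "no", "no", "yes", "no", "yes", "yes", "no"]
k = cohens_kappa(a, b)
print("kappa = %.2f -> %s" % (k, landis_koch_label(k)))
# Prints: kappa = 0.50 -> Moderate agreement

In this sketch a value falling exactly on a band boundary is assigned to the lower band, matching the inclusive upper limits printed in the table.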

References

  1. Landis JR, Koch GG (1977). "The measurement of observer agreement for categorical data". Biometrics 33 (1): 159–74. PMID 843571