Inter-rater reliability/Related Articles: Difference between revisions
Older revision: imported>Daniel Mietchen m (Robot: Creating Related Articles subpage)
Newer revision: imported>Daniel Mietchen m (Robot: encapsulating subpages template in noinclude tag)

Line 1:
− {{subpages}}
+ <noinclude>{{subpages}}</noinclude>
  ==Parent topics==
Revision as of 18:52, 11 September 2009
- See also changes related to Inter-rater reliability, or pages that link to Inter-rater reliability or to this page or whose text contains "Inter-rater reliability".
Parent topics
Subtopics
Bot-suggested topics
Auto-populated based on Special:WhatLinksHere/Inter-rater reliability. Needs checking by a human.
- Fleiss' kappa: A statistical measure of the reliability of agreement between a fixed number of raters when they assign categorical ratings to, or classify, a number of items.
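For orientation only (this formula is not part of the original listing), Fleiss' kappa is usually written as a chance-corrected agreement ratio; the symbols below (N items, n raters per item, k categories, n_{ij} raters assigning item i to category j) follow the standard presentation rather than anything on this page:

\[
\kappa = \frac{\bar{P} - \bar{P}_e}{1 - \bar{P}_e},
\qquad
\bar{P} = \frac{1}{N}\sum_{i=1}^{N} P_i,
\qquad
\bar{P}_e = \sum_{j=1}^{k} p_j^{2},
\]
\[
P_i = \frac{1}{n(n-1)}\left(\sum_{j=1}^{k} n_{ij}^{2} - n\right),
\qquad
p_j = \frac{1}{Nn}\sum_{i=1}^{N} n_{ij},
\]

A value of 1 indicates perfect agreement, while 0 indicates agreement no better than what chance alone would produce.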