On the Equivalence of Multirater Kappas Based on 2-Agreement and 3-Agreement with Binary Scores
Author
Warrens, Matthijs J.
Source
ISRN Probability and Statistics
Issue
Vol. 2012, Issue 2012 (31 Dec. 2012), pp. 1-11, 11 p.
Publisher
Hindawi Publishing Corporation
Publication Date
2012-10-15
Country of Publication
Egypt
No. of Pages
11
Main Subjects
Abstract (EN)
Cohen’s kappa is a popular descriptive statistic for summarizing agreement between the classifications of two raters on a nominal scale.
With m≥3 raters, there are several views in the literature on how agreement should be defined.
The concept of g-agreement (g∈{2,3,…,m}) refers to the situation in which it is decided that there is agreement if g out of the m raters assign an object to the same category.
Given m≥2 raters, we can formulate m−1 multirater kappas: one based on 2-agreement, one based on 3-agreement, and so on, up to one based on m-agreement.
It is shown that if the scale consists of only two categories, the multirater kappas based on 2-agreement and 3-agreement are identical.
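As a numerical illustration of the result stated in the abstract, the short Python sketch below computes a multirater kappa based on g-agreement, κ_(g) = (P_(g) − E_(g)) / (1 − E_(g)), and checks that the 2-agreement and 3-agreement versions coincide on binary data. The definitions assumed here (observed g-agreement P_(g) averaged over all subsets of g raters; expected g-agreement E_(g) from each rater's own marginal category proportions under independence) are one common Cohen-type reading, not necessarily the paper's exact formulas, and the function kappa_g is illustrative.

import itertools
import numpy as np

def kappa_g(ratings, g):
    # ratings: (n_objects, m_raters) array of category labels.
    # P: proportion of objects on which all raters in a g-subset agree,
    #    averaged over all C(m, g) subsets of raters.
    # E: the same average under independent raters, using each rater's
    #    own marginal category proportions (assumed chance model).
    ratings = np.asarray(ratings)
    n, m = ratings.shape
    cats = np.unique(ratings)
    p = np.array([[np.mean(ratings[:, i] == c) for c in cats] for i in range(m)])
    subsets = list(itertools.combinations(range(m), g))
    P = E = 0.0
    for S in subsets:
        sub = ratings[:, S]
        P += np.mean(np.all(sub == sub[:, [0]], axis=1))
        E += sum(np.prod(p[list(S), j]) for j in range(len(cats)))
    P /= len(subsets)
    E /= len(subsets)
    return (P - E) / (1 - E)

rng = np.random.default_rng(1)
x = rng.integers(0, 2, size=(50, 4))  # 50 objects, 4 raters, binary scores
print(kappa_g(x, 2), kappa_g(x, 3))   # the two printed values coincide

Under these assumed definitions the equivalence can also be verified algebraically: with two categories, the all-agree indicator for any triple of raters is an affine function of the corresponding pairwise indicators, which gives P_(3) − E_(3) = (3/2)(P_(2) − E_(2)) and 1 − E_(3) = (3/2)(1 − E_(2)), so the ratio defining kappa is unchanged.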
American Psychological Association (APA)
Warrens, Matthijs J. (2012). On the Equivalence of Multirater Kappas Based on 2-Agreement and 3-Agreement with Binary Scores. ISRN Probability and Statistics, Vol. 2012, no. 2012, pp. 1-11.
https://search.emarefa.net/detail/BIM-488721
Modern Language Association (MLA)
Warrens, Matthijs J. On the Equivalence of Multirater Kappas Based on 2-Agreement and 3-Agreement with Binary Scores. ISRN Probability and Statistics, No. 2012 (2012), pp. 1-11.
https://search.emarefa.net/detail/BIM-488721
American Medical Association (AMA)
Warrens, Matthijs J. On the Equivalence of Multirater Kappas Based on 2-Agreement and 3-Agreement with Binary Scores. ISRN Probability and Statistics. 2012. Vol. 2012, no. 2012, pp. 1-11.
https://search.emarefa.net/detail/BIM-488721
Data Type
Journal Articles
Language
English
Notes
Includes bibliographical references
Record ID
BIM-488721