2014
DOI: 10.1007/s11336-014-9439-4

A New Interpretation of the Weighted Kappa Coefficients

Abstract: Reliability and agreement studies are of paramount importance. They contribute to the quality of studies by providing information about the amount of error inherent in any diagnosis, score, or measurement. Guidelines for reporting reliability and agreement studies were recently provided. While the use of the kappa-like family is advised for categorical and ordinal scales, no further guideline on the choice of a weighting scheme is given. In the present paper, a new simple and practical interpretation of the …

Cited by 120 publications (92 citation statements). References 30 publications.
“…The time between responses was 12–14 days. We used weighted Cohen’s kappa [39] with linear and quadratic weights [40]. Additionally, a modified weighting scheme was used, with identical answers weighted 1, directly adjacent answers 0.8, and all others 0, since we expected the majority of retest responses to fall within ±1 of the test response.…”
Section: Methods
confidence: 99%
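
The modified scheme in the statement above maps directly onto a weight matrix. Below is a minimal sketch of weighted Cohen's kappa with such a matrix; the function name, the 5-point scale, and the rating vectors are hypothetical illustrations, not data from the cited study.

```python
import numpy as np

def weighted_kappa(ratings_a, ratings_b, n_categories, weights):
    """Chance-corrected weighted agreement between two raters."""
    # Joint proportion table pi_ij built from the paired ratings.
    table = np.zeros((n_categories, n_categories))
    for a, b in zip(ratings_a, ratings_b):
        table[a, b] += 1
    table /= table.sum()
    # Agreement expected by chance: product of the marginal proportions.
    expected = np.outer(table.sum(axis=1), table.sum(axis=0))
    p_obs = np.sum(weights * table)      # weighted observed agreement
    p_exp = np.sum(weights * expected)   # weighted chance agreement
    return (p_obs - p_exp) / (1 - p_exp)

K = 5  # hypothetical 5-point response scale
W = np.zeros((K, K))
for i in range(K):
    for j in range(K):
        if i == j:
            W[i, j] = 1.0    # identical answers
        elif abs(i - j) == 1:
            W[i, j] = 0.8    # directly adjacent answers
# all other cells keep weight 0

test = [0, 1, 2, 3, 4, 2, 1, 3]    # hypothetical test responses
retest = [0, 2, 2, 3, 3, 2, 1, 4]  # hypothetical retest responses
print(weighted_kappa(test, retest, K, W))
```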
“…With ordered categories, there is usually more disagreement between the classifiers on adjacent categories than on categories that are further apart. Weighted kappa allows the user to describe the closeness between categories using weights (Vanbelle 2015; Warrens 2013, 2014). The real number 0 ≤ w_ij ≤ 1 denotes the weight corresponding to cell (i, j) of the tables {π_ij} and {π_{i+} π_{+j}}.…”
Section: Notation and Weighted Kappa
confidence: 99%
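
For reference, the coefficient this notation describes is the classical weighted kappa (Cohen, 1968); a standard form, written with the cell weights w_ij and the two tables above, is:

```latex
\kappa_w =
  \frac{\sum_{i,j} w_{ij}\,\pi_{ij} \;-\; \sum_{i,j} w_{ij}\,\pi_{i+}\pi_{+j}}
       {1 \;-\; \sum_{i,j} w_{ij}\,\pi_{i+}\pi_{+j}}
```

The first sum is the weighted observed agreement and the second is the weighted agreement expected by chance under independent margins.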
“…The agreement between the observers can be used to investigate the reliability of the rating scale. Standard tools for quantifying agreement between classifications with identical categories are Cohen's kappa in the case of nominal categories (Yang and Zhou 2014; Warrens 2010b), and weighted kappa in the case of ordinal categories (Vanbelle 2015; Yang and Zhou 2015; Warrens 2012, 2013, 2015). Both coefficients correct for agreement due to chance.…”
Section: Introduction
confidence: 99%
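
As an illustration of the two standard tools named above, the sketch below uses scikit-learn's cohen_kappa_score (assuming scikit-learn is installed); the rating vectors are hypothetical.

```python
from sklearn.metrics import cohen_kappa_score

rater_a = [0, 1, 2, 2, 3, 1, 0, 2]  # hypothetical scores from observer A
rater_b = [0, 2, 2, 1, 3, 1, 0, 3]  # hypothetical scores from observer B

# Nominal categories: unweighted Cohen's kappa treats all disagreements alike.
print(cohen_kappa_score(rater_a, rater_b))

# Ordinal categories: weighted kappa penalizes disagreements by the
# distance between categories (linear or quadratic weights).
print(cohen_kappa_score(rater_a, rater_b, weights="linear"))
print(cohen_kappa_score(rater_a, rater_b, weights="quadratic"))
```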
“…the proportion of items classified in the same category by the two observers) to avoid that dependency. These absolute coefficients are, however, not sensitive to a scale's inability to distinguish between items in a population with low prevalence, and kappa coefficients are therefore to be preferred (Rogot and Goldberg, 1966; Vach, 2005; Kraemer et al., 2004; Vanbelle, 2016).…”
Section: Introduction
confidence: 99%
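
A small worked example of the low-prevalence point, using a hypothetical 2×2 table for 100 items: raw agreement looks excellent while kappa reveals that most of it is expected by chance alone.

```python
# Two observers screen 100 items for a rare condition (hypothetical counts):
#                 B: positive   B: negative
# A: positive          1             4
# A: negative          4            91
p_o = (1 + 91) / 100                 # observed agreement = 0.92
p_pos_a, p_pos_b = 5 / 100, 5 / 100  # marginal prevalences of "positive"
# Chance agreement from the margins: both positive or both negative.
p_e = p_pos_a * p_pos_b + (1 - p_pos_a) * (1 - p_pos_b)  # = 0.905
kappa = (p_o - p_e) / (1 - p_e)      # ≈ 0.158: poor, despite 92% agreement
print(p_o, p_e, round(kappa, 3))
```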