2000
DOI: 10.1002/(sici)1097-0258(20000315)19:5<723::aid-sim379>3.0.co;2-a
Interval estimation for Cohen's kappa as a measure of agreement

Abstract: Cohen's kappa statistic is a very well known measure of agreement between two raters with respect to a dichotomous outcome. Several expressions for its asymptotic variance have been derived and the normal approximation to its distribution has been used to construct confidence intervals. However, information on the accuracy of these normal‐approximation confidence intervals is not comprehensive. Under the common correlation model for dichotomous data, we evaluate 95 per cent lower confidence bounds constructed …

Cited by 167 publications (118 citation statements)
References 31 publications
“…The Cohen κ was 0.81 and 0.82 for collateral connection grade, 0.73 and 0.81 for Rentrop classification, 0.75 and 0.83 for wall motion score, and 0.80 and 0.87 for LGE transmurality. 23 The limits of agreement of the LGE volume (%) were −2.1±7.4% and 2.6±6.4% by Bland-Altman analysis, respectively.…”
Section: Discussion
confidence: 99%
“…A kappa value indicates poor (≤0), slight (0.01-0.20), fair (0.21-0.40), moderate (0.41-0.60), substantial (0.61-0.80), and almost perfect (0.81-1.00) agreement. 12 The validity of the CVM method was represented by agreement between the gold standard and estimated staging for the initial time if intraobserver agreement was acceptable. This was also calculated using the weighted kappa statistic.…”
Section: Discussion
confidence: 99%
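The banded interpretation scale quoted above can be expressed as a small lookup. This is an illustrative sketch only: the function name and structure are my own, and the band boundaries are exactly those given in the excerpt.

```python
def interpret_kappa(kappa):
    """Map a kappa value to the agreement label for its band
    (boundaries as quoted in the citation statement above)."""
    if kappa <= 0:
        return "poor"
    bands = [(0.20, "slight"), (0.40, "fair"), (0.60, "moderate"),
             (0.80, "substantial"), (1.00, "almost perfect")]
    for upper, label in bands:
        if kappa <= upper:
            return label
    return "almost perfect"  # values above 1.00 cannot occur for kappa

print(interpret_kappa(0.75))  # falls in the 0.61-0.80 band: "substantial"
```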
“…When the true interest is agreement between raters instead of mere association between ratings, the marginal distributions should not be too dispersed. Therefore, it is very important to impose the assumption of homogeneity of marginal distributions on Cohen's κ (Blackman and Koval 2000; Block and Kraemer 1989; Brennan and Prediger 1981; Zwick 1988). …”
Section: Assumptions of Cohen's κ
confidence: 99%
“…The sampling distribution of κ appears to be very non-symmetric when n is small (Blackman and Koval 2000; Block and Kraemer 1989; Koval and Blackman 1996). With a large enough n, the sampling distribution of κ is approximately normal so that confidence intervals (CI) and significance tests can be easily done using standard normal distribution quantiles.…”
Section: Sampling Distribution of Cohen's κ
confidence: 99%
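The normal-approximation interval described above can be sketched for the two-rater, dichotomous case. This is a minimal illustration, not the paper's method: it uses the crude large-sample variance p_o(1−p_o)/(n(1−p_e)²), whereas the fuller asymptotic variance expressions the abstract refers to carry additional terms, and all names here are my own.

```python
import math

def kappa_ci(a, b, c, d, z=1.959963984540054):
    """Cohen's kappa and a simple 95% normal-approximation CI
    for a 2x2 agreement table:
                 rater2=+  rater2=-
      rater1=+      a         b
      rater1=-      c         d
    Uses a crude variance sketch; see lead-in for caveats."""
    n = a + b + c + d
    p_o = (a + d) / n                      # observed agreement
    p1 = (a + b) / n                       # rater 1 marginal proportion "+"
    p2 = (a + c) / n                       # rater 2 marginal proportion "+"
    p_e = p1 * p2 + (1 - p1) * (1 - p2)    # chance-expected agreement
    kappa = (p_o - p_e) / (1 - p_e)
    se = math.sqrt(p_o * (1 - p_o) / (n * (1 - p_e) ** 2))
    return kappa, kappa - z * se, kappa + z * se

k, lo, hi = kappa_ci(40, 10, 5, 45)        # hypothetical 2x2 counts
print(f"kappa = {k:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")  # kappa = 0.70 here
```

Because the small-sample distribution of the estimator is skewed, as the excerpt notes, such symmetric intervals can undercover when n is small, which is precisely the accuracy question the paper investigates.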