2008
DOI: 10.1111/j.1365-2753.2008.00984.x

Relationships between statistical measures of agreement: sensitivity, specificity and kappa

Abstract: The analytic formulas and graph could be useful to clinicians and biostatisticians in better interpreting the outcomes of an alternative diagnostic test whenever the measures sensitivity, specificity and kappa are employed together.
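As a reading aid (not code from the paper itself), the sketch below illustrates the kind of analytic relationship the abstract refers to: for a binary test compared against a reference standard, Cohen's kappa can be written in terms of sensitivity, specificity and prevalence. The function name and the example numbers are illustrative assumptions.

```python
# Minimal sketch (not code from the paper): Cohen's kappa for a binary test
# written in terms of sensitivity, specificity and prevalence, assuming a
# standard 2x2 comparison against a reference standard.

def kappa_from_se_sp(sensitivity: float, specificity: float, prevalence: float) -> float:
    """Kappa implied by sensitivity, specificity and disease prevalence."""
    p = prevalence
    # Observed agreement: true positives plus true negatives, as proportions.
    p_observed = p * sensitivity + (1 - p) * specificity
    # Proportion testing positive under these operating characteristics.
    test_positive = p * sensitivity + (1 - p) * (1 - specificity)
    # Chance agreement from the marginal distributions of test and reference.
    p_chance = test_positive * p + (1 - test_positive) * (1 - p)
    return (p_observed - p_chance) / (1 - p_chance)

# Illustrative values only: Se = 0.90, Sp = 0.80, prevalence = 0.30.
print(round(kappa_from_se_sp(0.90, 0.80, 0.30), 3))  # about 0.63
```

Because chance agreement depends on prevalence, the same sensitivity and specificity can yield quite different kappa values across populations, which is the practical point of interpreting the three measures together.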

Cited by 61 publications (51 citation statements). References 10 publications.
“…The kappa term ranges from −1 to 1 and can be negative if the agreement is less than what would be expected by chance. The following labels were assigned to the corresponding ranges of kappa strength: poor agreement, <0; slight, 0.0 to 0.20; fair, 0.21 to 0.40; moderate, 0.41 to 0.60; substantial, 0.61 to 0.80; and almost perfect, 0.81 to 1.00 (12,13). All statistical analyses were performed using SAS version 9.2 (SAS Institute, Cary, NC).…”
Section: Methods (mentioning)
confidence: 99%
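For readers who want the quoted strength-of-agreement scale in executable form, here is a minimal sketch of the Landis & Koch style cut-points transcribed from the excerpt above; the function name is an assumption.

```python
# Minimal sketch of the strength-of-agreement labels quoted above
# (Landis & Koch style cut-points); the function name is an assumption.

def kappa_label(kappa: float) -> str:
    """Map a kappa value to the agreement label used in the cited methods."""
    if kappa < 0:
        return "poor"
    if kappa <= 0.20:
        return "slight"
    if kappa <= 0.40:
        return "fair"
    if kappa <= 0.60:
        return "moderate"
    if kappa <= 0.80:
        return "substantial"
    return "almost perfect"

print(kappa_label(0.55))  # "moderate"
```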
“…Although there is a long discussion about the problems of using the kappa coefficient (e.g., [6,15,51]), it is usually reported in those studies. However, no further correction is performed, since the resulting estimation model is not well-identified, making the derivation of a precise-valued estimator impossible.…”
Section: Introduction (mentioning)
confidence: 97%
“…The differences in the prevalence of sufficient levels of physical activity between questionnaires were evaluated by the overlap of 95% confidence intervals. To evaluate agreement between the simplified questionnaire and the detailed questionnaire in correctly classifying adolescents by level of physical activity (sufficiently vs insufficiently physically active), the kappa index was used (values up to 0.19 were classified as poor; 0.20 to 0.39, slight agreement; 0.40 to 0.59, moderate agreement; 0.60 to 0.79, substantial agreement; and values above 0.80, almost perfect 20 ), together with sensitivity and specificity measures, taking the measure obtained with the detailed questionnaire as the reference 21 . Statistical analyses were performed using Stata 13 software.…”
Section: Methodological Procedures (mentioning)
confidence: 99%
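The excerpt above evaluates the simplified questionnaire against the detailed questionnaire using sensitivity, specificity and kappa; the sketch below shows one way such a 2x2 comparison can be summarised. The cell counts are hypothetical and are not data from the cited study.

```python
# Minimal sketch with hypothetical counts (not data from the cited study):
# a 2x2 comparison of a simplified questionnaire against a detailed reference
# questionnaire, summarised as sensitivity, specificity and Cohen's kappa.

def agreement_from_counts(tp: int, fp: int, fn: int, tn: int):
    """tp, fp, fn, tn: cell counts with the detailed questionnaire as reference."""
    n = tp + fp + fn + tn
    sensitivity = tp / (tp + fn)   # active on both / active on the reference
    specificity = tn / (tn + fp)   # inactive on both / inactive on the reference
    p_observed = (tp + tn) / n     # raw agreement
    # Chance agreement from the row and column marginals.
    p_chance = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2
    kappa = (p_observed - p_chance) / (1 - p_chance)
    return sensitivity, specificity, kappa

# Hypothetical 2x2 table of 400 adolescents.
print(agreement_from_counts(tp=120, fp=40, fn=60, tn=180))
```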
“…(20.1%, 95% CI: 18.6 to 21.6) was lower than that of the detailed questionnaire (50.2%, 95% CI: 48.7). Agreement between the measures from the two questionnaires was slight (kappa ranging from 0.21 to 0.34).…”
unclassified