2016
DOI: 10.3899/jrheum.151300

Reliability and Accuracy of Cross-sectional Radiographic Assessment of Severe Knee Osteoarthritis: Role of Training and Experience

Abstract: Objective To determine the reliability of radiographic assessment of knee osteoarthritis (OA) by non-clinician readers compared to an experienced radiologist. Methods The radiologist trained three non-clinicians to evaluate radiographic characteristics of knee OA. The radiologist and non-clinicians read preoperative films of 36 patients prior to total knee replacement. Intra- and inter-reader reliability was measured using the weighted kappa statistic and intra-class correlation coefficient (ICC). Kappa <0.2…
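To make the abstract's reliability statistics concrete, the minimal Python sketch below computes a quadratically weighted Cohen's kappa between two readers' Kellgren-Lawrence grades using scikit-learn. The grade arrays are invented for illustration and are not data from the study; the ICC mentioned in the abstract would be computed analogously with a dedicated reliability package.

```python
# Minimal sketch of a weighted kappa between two readers' K-L grades.
# Assumption: the grades below are hypothetical, not the study's data.
from sklearn.metrics import cohen_kappa_score

# Hypothetical K-L grades (0-4) for the same 10 knees from two readers.
radiologist   = [0, 1, 2, 2, 3, 4, 4, 3, 2, 1]
non_clinician = [0, 1, 2, 3, 3, 4, 3, 3, 2, 1]

# Quadratic weights penalize large disagreements (e.g., grade 1 vs 4) more than
# near-misses, the usual choice for ordinal scales such as the K-L grade.
kappa = cohen_kappa_score(radiologist, non_clinician, weights="quadratic")
print(f"Weighted kappa: {kappa:.2f}")
```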

Cited by 24 publications (19 citation statements) | References 15 publications
“…A recent study aimed to determine the reliability of radiographic assessment of knee osteoarthritis (OA) by non-clinician readers compared to an experienced radiologist. 23 This study showed that intra-reader reliability for the radiologist (kappa) ranged from 0.40 to 1.0 for individual radiographic features and from 0.72 to 1.0 for Kellgren-Lawrence (K-L) grade. Inter-reader agreement among non-clinicians ranged from a kappa of 0.45 to 0.94 for individual features and from 0.66 to 0.97 for K-L grade.…”
Section: Advances in Radiographic Assessment of Knee OA
confidence: 75%
“…The performance of both these studies is comparable to human reliability. A recent study aimed to determine the reliability of radiographic assessment of knee osteoarthritis (OA) by non-clinician readers compared to an experienced radiologist. This study showed that intra-reader reliability for the radiologist (kappa) ranged from 0.40 to 1.0 for individual radiographic features and from 0.72 to 1.0 for Kellgren-Lawrence (K-L) grade.…”
Section: Advances in Radiographic Assessment of Knee OA
confidence: 99%
“…The discrete binning of disease classes is human-engineered, and the underlying biology of disease is usually more accurately described as a spectrum 4 . In addition, previous work has shown that there is substantial intra- and inter-expert variability in the annotation of both plus disease classification and KL grade 4,21,22 .…”
Section: Discussion
confidence: 99%
“…Inter-observer agreement of image quality was analyzed with a Kappa test. Levels of agreement were defined as follows: k < 0.20 indicated slight agreement; k = 0.20-0.40, fair agreement; k = 0.41-0.60, moderate agreement; k = 0.61-0.80, substantial agreement; k = 0.81-1.0, excellent agreement [5]. One-way ANOVA was used for measuring phantom size changes in three anatomy positions at different table heights.…”
Section: Discussion
confidence: 99%
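The agreement bands in the last quotation translate directly into a simple lookup. A minimal sketch, assuming only the thresholds given in the cited text (the helper name is hypothetical):

```python
# Map a kappa value to the agreement band quoted above.
# Assumption: band boundaries follow the cited text; function name is illustrative.
def agreement_level(k: float) -> str:
    if k < 0.20:
        return "slight"
    if k <= 0.40:
        return "fair"
    if k <= 0.60:
        return "moderate"
    if k <= 0.80:
        return "substantial"
    return "excellent"

print(agreement_level(0.72))  # -> "substantial"
```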