2006
DOI: 10.1038/modpathol.3800496
Interobserver agreement and reproducibility in classification of invasive breast carcinoma: an NCI breast cancer family registry study

Abstract: Registry of this type; over 37 724 individuals have been enrolled to date. One activity of this Registry is the semicentralized pathologic review of tumors from all probands. Given the semicentralized nature of the review, this study was undertaken to determine the reproducibility, source(s) of classification discrepancies and stratagems to circumvent discrepancies for histologic subtyping and grading of invasive breast cancer among the reviewing pathologists. A total of 13 pathologists reviewed 35 invasive br…

Cited by 105 publications (74 citation statements)
References 24 publications
“…However, in their study, a prior training session was performed for the pathologists to standardize the evaluation, which was not done in our study. Our results are in line with the published data and show that the interobserver agreement for ILC is moderate [34,35]. Hormone-receptor status and grade were used to further distinguish between more aggressive ILCs (i.e.…”
Section: Discussion (supporting)
confidence: 91%
“…In a study by Kiaer et al the kappa value for ILC versus invasive ductal carcinoma between each central pathology and the country as a whole was 0.3 for a cohort of 379 breast carcinomas. Longacre et al [34] showed in a cohort of N = 35 cases (including five lobular carcinomas) from a cancer registry that the accuracy for diagnosis of lobular carcinoma (comparing local assessment with reference pathology) had a mean of 90 % and a kappa value of 0.8. However, in their study, a prior training session was performed for the pathologists to standardize the evaluation, which was not done in our study.…”
Section: Discussion (mentioning)
confidence: 99%
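The kappa values quoted above (0.3 for Kiaer et al., 0.8 for Longacre et al.) are Cohen's kappa, a chance-corrected measure of agreement between two raters. As a minimal sketch, the statistic can be computed from two raters' labels as follows; the labels below are illustrative, not data from either study:

```python
# Cohen's kappa for two raters classifying the same cases,
# e.g. lobular ("L") vs. ductal ("D") carcinoma.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """kappa = (p_o - p_e) / (1 - p_e), where p_o is observed
    agreement and p_e is chance agreement from the marginals."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: fraction of cases where the raters concur.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: product of each rater's marginal frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / n ** 2
    return (p_o - p_e) / (1 - p_e)

# Hypothetical diagnoses from two pathologists on ten cases.
a = ["L", "L", "D", "D", "D", "L", "D", "D", "D", "D"]
b = ["L", "D", "D", "D", "D", "L", "D", "D", "L", "D"]
print(round(cohens_kappa(a, b), 3))  # → 0.524
```

By convention, kappa of 0.41–0.60 is read as "moderate" and 0.61–0.80 as "substantial" agreement, which is the scale the citing authors apply when calling ILC agreement moderate.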
“…(Mengel et al 2002, 296) There is a body of literature in which the agreement among pathologists (or lack thereof) has been assessed, generally with kappa coefficients in the moderate to substantial ranges, only a few of which are cited (Coco et al 2011; Dalton, Page, and Dupont 1994; Engers 2007; Farmer, Gonin, and Hanna 1996; Foucar 1998, 2005; Longacre et al 2006; McCluggage et al 2011; Montgomery 2005; Sloane et al 1999; van den Bent 2010). Fortunately, when specific criteria for diagnoses or grading are clearly defined, there is less interobserver variability (Carlson et al 1998; Komaki, Sano, and Tangoku 2006; Letourneux et al 2006; Longacre et al 2006; Rugge et al 2002). In short, there is little evidence that the 'art of pathology' improves diagnostic accuracy, but there is considerable evidence that well-defined, objective, and quantifiable criteria for disease diagnosis or tumor grading improve the reproducibility of morphologic assessment.…”
Section: Dunstan et al., Toxicologic Pathology, Qualitative/Descriptive (mentioning)
confidence: 99%
“…The reported kappa values differ, depending on the pathologists' experience in breast pathology [24]. In our study most of the tumours were evaluated by one pathologist; we do not regard this as a disadvantage.…”
Section: Discussion (mentioning)
confidence: 95%