2015
DOI: 10.1016/j.jse.2014.10.025
Agreement of olecranon fractures before and after the exposure to four classification systems

Cited by 24 publications (26 citation statements)
References 18 publications
“…Therefore it is difficult to draw firm conclusions on the usefulness of this parameter. Overall interobserver agreement for assessment of morphologic features of a middle phalanx base fracture was much greater than other anatomic areas (humerus, elbow, clavicle, and olecranon) [1,2,5,8,10]. We found only fair agreement, indicating variation, regarding specific proposed treatments, which might be a reflection of the lack of high-level evidence and relatively small case series on which surgeons base their decisions [7].…”
Section: Discussion (citation type: mentioning)
Confidence: 68%
“…For all cases, the following seven questions were asked in the following order: (1) An invitation to participate was sent to a list of surgeons built as part of the Science Of Variation Group (SOVG) [11,12]. The SOVG consists of 691 orthopaedic, trauma, and plastic surgeons, all with an interest in treating upper extremity conditions.…”
Section: Methods (citation type: mentioning)
Confidence: 99%
“…Two studies of which we are aware have assessed the reproducibility of the Mayo classification in comparison to the Colton, Schatzker, and AO classifications [3,21]. The referenced studies assessed agreement using a k coefficient rather than observed agreement.…”
Section: Validation (citation type: mentioning)
Confidence: 99%
“…Interobserver agreement refers to the agreement between different observers, whereas intraobserver agreement is the measure of repeated agreement of the same observer at different time points. These studies showed interobserver agreement for the Mayo classification system of k = 0.19 [3] and k = 0.32 [19], which represents poor reliability, with interobserver agreement being only marginally greater than chance alone.…”
Section: Validation (citation type: mentioning)
Confidence: 99%