Introduction
Acetabular defect recognition and classification remain challenging tasks for orthopedic surgeons. Recently, the Acetabular Defect Classification (ADC) was introduced to provide a reliable, reproducible, and intuitive classification system. To improve the ease of use and efficiency of the ADC, a browser-based application has been created. We hypothesized that the ADC application would enable non-specialists (medical students) to achieve good inter- and intra-rater agreement and that their rating performance would compare favorably with that of specialists (experienced surgeons) working without the application.
Materials and methods
The ADC is based on the integrity of the acetabular rim and the supporting structures. It consists of four main defect types of ascending severity, each further subdivided into subtypes A–C to narrow down the defect location. Eighty randomized radiographs were graded according to the ADC by three non-specialists (medical students) with the help of the ADC application and by three specialists (orthopedic surgeons) without the application, in order to evaluate the difference in inter-rater agreement between the groups. To assess intra-rater agreement, the rating process was repeated after a reasonable wash-out period.
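The abstract does not name the agreement statistic explicitly, but the reported values are consistent with kappa-type coefficients, which correct the observed agreement for agreement expected by chance:

\[
\kappa = \frac{p_o - p_e}{1 - p_e}
\]

where \(p_o\) is the observed proportion of agreement (between raters, or between a rater's two rounds of ratings) and \(p_e\) is the proportion of agreement expected by chance.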
Results
Inter- and intra-rater agreement in the non-specialist group was lower than in the specialist group but still fell within the good agreement range. The student group reached κ values of 0.61 for inter-rater and 0.68 for intra-rater agreement, while the surgeon group reached κ values of 0.72 for inter-rater and 0.83 for intra-rater agreement.
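For illustration only, the sketch below shows how agreement coefficients of this kind are commonly computed with standard Python libraries. The ratings are hypothetical placeholders, not study data, and the exact statistical procedure used in the study is not specified in the abstract.

```python
# Minimal sketch: chance-corrected agreement for multiple raters.
# All ratings below are invented examples in ADC-style notation ("1A"-"4C").
import numpy as np
from sklearn.metrics import cohen_kappa_score
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

# Hypothetical grades for the same radiographs:
# rows = radiographs, columns = the three raters of one group.
ratings_round1 = np.array([
    ["1A", "1A", "1B"],
    ["2C", "2C", "2C"],
    ["3B", "3A", "3B"],
    ["4A", "4A", "4A"],
])

# Inter-rater agreement: Fleiss' kappa across the three raters.
counts, _ = aggregate_raters(ratings_round1)
print("inter-rater kappa:", fleiss_kappa(counts, method="fleiss"))

# Intra-rater agreement: Cohen's kappa between one rater's first and
# second reading of the same radiographs after the wash-out period.
rater1_round1 = ratings_round1[:, 0]
rater1_round2 = np.array(["1B", "2C", "3B", "4A"])  # hypothetical re-rating
print("intra-rater kappa:", cohen_kappa_score(rater1_round1, rater1_round2))
```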
Conclusion
The app-guided assessment of acetabular defects offers a promising and innovative approach to simplifying a complex classification task. It makes the challenging field of acetabular revision arthroplasty more approachable, especially for less experienced surgeons, and offers insight and guidance both in the planning stage and in the intra-operative setting.