A large database of digital chest radiographs was developed over a 14-month period. Ten radiographic technologists and five radiologists independently evaluated a stratified subset of images from the database for quality deficiencies and decided whether each image should be rejected. The results showed only moderate agreement between the radiographic technologists and the radiologists in their assessments. Agreement between the radiologist and technologist reader groups was lower than the inter-reader agreement within each group. Radiologists were more accepting of limited-quality studies than technologists. Evidence from the study suggests that the technologists weighted their reject decisions more heavily toward objective technical attributes, while the radiologists weighted their decisions more heavily toward diagnostic interpretability relative to the imaging indication. A suite of reject-detection algorithms was independently run on the images in the database. The algorithms detected 4% of postero-anterior chest exams that were accepted by the technologist who originally captured the image but that would have been rejected by the technologist peer group. When algorithm results were made available to the technologists during the study, inter-reader agreement on reject decisions did not improve. The algorithm results do, however, provide new quality information that could be captured within a site-wide reject-tracking database and leveraged as part of a site-wide QA program.
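The abstract does not state which agreement statistic the study used; Cohen's kappa is one common choice for quantifying inter-reader agreement on binary reject decisions. The following is a minimal, self-contained Python sketch under that assumption; the reader decisions in the example are invented for illustration.

```python
# Hypothetical sketch: quantifying inter-reader agreement on binary reject
# decisions (1 = reject, 0 = accept) with Cohen's kappa. The study abstract
# does not specify its agreement statistic; kappa is shown only as an example.

from collections import Counter

def cohens_kappa(reader_a, reader_b):
    """Cohen's kappa for two readers' binary decisions over the same images."""
    assert len(reader_a) == len(reader_b) and reader_a, "need paired decisions"
    n = len(reader_a)
    # Observed agreement: fraction of images both readers scored the same.
    observed = sum(a == b for a, b in zip(reader_a, reader_b)) / n
    # Chance agreement from each reader's marginal accept/reject rates.
    counts_a, counts_b = Counter(reader_a), Counter(reader_b)
    expected = sum((counts_a[c] / n) * (counts_b[c] / n) for c in (0, 1))
    return (observed - expected) / (1 - expected)

# Made-up decisions for ten images from one technologist and one radiologist.
technologist = [1, 0, 0, 1, 0, 0, 0, 1, 0, 0]
radiologist  = [0, 0, 0, 1, 0, 0, 0, 0, 0, 0]
print(f"kappa = {cohens_kappa(technologist, radiologist):.2f}")
```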
Purpose: To determine whether a proposed suite of objective image quality metrics for digital chest radiographs is useful for monitoring image quality in our clinical operation. Methods: Seventeen gridless AP chest radiographs from a GE Optima portable digital radiography (DR) unit (Group 1), seventeen routine PA chest radiographs from a GE Discovery DR unit (Group 2), and sixteen gridless (non-routine) PA chest radiographs from the same Discovery DR unit (Group 3) were chosen for analysis. The groups were selected to represent "sub-standard" images (Group 1), "standard-of-care" images (Group 2), and images with a gross technical error (Group 3). Group 1 images were acquired with lower kVp (90 vs. 125) and a shorter source-to-image distance (127 cm vs. 183 cm) and were expected to have lower quality than the images in Group 2. Group 3 images were expected to have degraded contrast relative to Group 2. The evaluation was approved by the institutional Quality Improvement Assurance Board (QIAB). Images were anonymized and securely transferred to the Duke University Clinical Imaging Physics Group for analysis using software previously described [1] and validated [2]. Image quality for individual images was reported in terms of lung grey level (Lgl); lung noise (Ln); rib-lung contrast (RLc); rib sharpness (Rs); mediastinum detail (Md), noise (Mn), and alignment (Ma); subdiaphragm-lung contrast (SLc); and subdiaphragm area (Sa). Metrics were compared across groups. Results: The metrics agreed with published Quality Consistency Ranges with three exceptions: higher Lgl, lower RLc, and lower SLc. The higher bit depth of our images (16 vs. 12) accounted for the higher Lgl values. Values were most internally consistent for Group 2. The most sensitive metric for distinguishing between groups was Mn, followed closely by Ln; the least sensitive metrics were Md and RLc. Conclusion: The software appears promising for objectively and automatically identifying substandard images in our operation. The results can be used to establish local quality consistency ranges and action limits per facility preferences.
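The conclusion's proposal of local quality consistency ranges and action limits amounts to a per-metric range check. Below is a minimal Python sketch of that idea; the metric abbreviations follow the abstract, but every numeric range and measurement is an invented placeholder, since the abstract reports neither the published nor the local limits.

```python
# Hypothetical sketch of the QA check described in the conclusion: flag any
# image whose metric values fall outside locally established quality
# consistency ranges. All numeric limits below are invented placeholders,
# not published or site-specific values.

# Local quality consistency ranges (low, high) per metric -- placeholders.
CONSISTENCY_RANGES = {
    "Lgl": (2000.0, 3000.0),   # lung grey level
    "Ln":  (5.0, 20.0),        # lung noise
    "RLc": (0.10, 0.40),       # rib-lung contrast
    "Mn":  (3.0, 15.0),        # mediastinum noise
    "SLc": (0.15, 0.50),       # subdiaphragm-lung contrast
}

def out_of_range_metrics(measurements):
    """Return (metric, value, (low, high)) for each value outside its range."""
    flags = []
    for metric, value in measurements.items():
        low, high = CONSISTENCY_RANGES.get(metric, (float("-inf"), float("inf")))
        if not low <= value <= high:
            flags.append((metric, value, (low, high)))
    return flags

# Example image with made-up metric values; Lgl and RLc should be flagged.
image_metrics = {"Lgl": 3400.0, "Ln": 12.0, "RLc": 0.08, "Mn": 9.0, "SLc": 0.30}
for metric, value, (low, high) in out_of_range_metrics(image_metrics):
    print(f"ACTION: {metric} = {value} outside [{low}, {high}]")
```

In practice the action limits would be tuned per facility, as the conclusion notes, rather than hard-coded.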