Rejected images represent both unnecessary radiation exposure to patients and inefficiency in the imaging operation. They are inherent to projection radiography, where patient positioning and alignment are integral components of image quality; patient motion and artifacts unique to digital image receptor technology can also result in rejected images. We present a centralized, server-based solution for the collection, archival, and distribution of rejected image and exposure indicator data that automates the data collection process. Reject analysis program (RAP) and exposure indicator data were collected and analyzed during a 1-year period. RAP data were sorted both by reason for repetition and by body part examined, and were further stratified by clinical area. The monthly composite reject rate for our institution fluctuated between 8% and 10%. Positioning errors were the main cause of repeated images (77.3%). Stratification of data by clinical area revealed that areas where computed radiography (CR) is seldom used suffer from higher reject rates than areas where it is used frequently. S values were log-normally distributed for examinations performed under either manual or automatic exposure control; the distributions were positively skewed and leptokurtic. Decreases in S value attributable to radiologic technology student rotations and to CR plate reader calibrations were observed. Our data demonstrate that reject analysis is still necessary and useful in the era of digital imaging. It is vital, though, that it be combined with exposure indicator analysis, as digital radiography is not self-policing in terms of exposure. Combined, the two programs are a powerful tool for quality assurance.
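The quantities the abstract reports are simple to compute: a composite reject rate is the rejected fraction of all acquired images, and the log-normality claim means that the logarithm of the S values should look approximately normal (near-zero skewness and excess kurtosis) while the raw values appear positively skewed and leptokurtic. A minimal sketch, using synthetic S values rather than the study's data:

```python
import numpy as np

def reject_rate(rejected, total):
    """Composite reject rate, as a percentage of all acquired images."""
    return 100.0 * rejected / total

def shape_stats(values):
    """Sample skewness and excess kurtosis of a data set."""
    v = np.asarray(values, dtype=float)
    z = (v - v.mean()) / v.std()
    return (z ** 3).mean(), (z ** 4).mean() - 3.0

# Synthetic S values drawn from a log-normal distribution (the parameters
# are illustrative, not taken from the study). The raw values should be
# positively skewed and leptokurtic; log(S) should be near-normal.
rng = np.random.default_rng(0)
s = rng.lognormal(mean=5.3, sigma=0.4, size=10_000)
raw_skew, raw_kurt = shape_stats(s)
log_skew, log_kurt = shape_stats(np.log(s))
```

In practice the skew and kurtosis of log(S) serve as a quick plausibility check before fitting a log-normal model to exposure indicator data.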
Since the introduction of Picture Archiving and Communications Systems (PACS) into the medical radiology community, the efficiency and workflow of radiology departments adopting this technology have improved to a degree much greater than initially anticipated. Although technological advances fuel efficiency and improve workflow, they also create new opportunities for medical error. This observation underscores the need for active measures by PACS quality assurance personnel to prevent human and machine errors from contributing to the total of avoidable medical errors, particularly in the area of diagnostic imaging.
Purpose: To determine whether discrepancies between exposure conditions recommended by the manufacturer of the EZ CR-DIN phantom, by the DIN standard, and by a user affect evaluation of phantom images. Method and Materials: Computed radiographs (CR) of the Nuclear Associates EZ CR-DIN phantom (Fluke Biomedical, Cleveland, OH) were acquired in triplicate under three exposure conditions: 72 kVp, the manufacturer's [1]; 80 kVp, the user's [2]; and 70 kVp with 25 mm Al added filtration, the DIN specification [3]. Aluminum 1100-H14 was substituted for 99.4% Al [4]. kVp was verified non-invasively. Projections of the phantom used AEC or manual technique on ST-VI imaging plates (IPs) at 100 cm SID in the under-table Bucky tray. IPs were developed without delay with test menus in semi-automatic EDR mode using an FCR5000 (FujiFilm USA, Stamford, CT) calibrated for sensitivity and uniformity. DICOM images were transmitted to the PACS using unity rescale slope and zero intercept. Images were exported from the PACS to a PC and analyzed using MATLAB (MathWorks, Natick, MA). Spectra for the three exposure conditions were generated using a semi-empirical method [5]. Results: Evaluation of low-contrast features depended on exposure conditions, test menu selection, and subject contrast values. Contrast was exaggerated under the manufacturer's exposure conditions but was indistinguishable between user and DIN conditions. Simulated spectra attenuated by the phantom indicated average energies of 56, 61, and 59 keV for the manufacturer, user, and DIN conditions, respectively. A high-contrast test menu improved visibility of low-contrast features but did not resolve all six steps of the dynamic range feature. Detected contrasts were more consistent with the manufacturer's subject contrast values than with those in the current DIN standard. Conclusion: Comparison of QC results among institutions depends on standard test objects imaged under standardized conditions. Differences in exposure and development conditions affect the evaluation of test images.
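The average energies quoted in the abstract above are fluence-weighted means of the attenuated spectra. A minimal sketch of that calculation, using a hypothetical spectrum array rather than the semi-empirical model the authors used:

```python
import numpy as np

def mean_energy(energies_kev, fluence):
    """Fluence-weighted average energy of an X-ray spectrum, in keV."""
    e = np.asarray(energies_kev, dtype=float)
    phi = np.asarray(fluence, dtype=float)
    return float((e * phi).sum() / phi.sum())

# Hypothetical attenuated spectrum sampled at 10 keV intervals
energies = [40, 50, 60, 70, 80]
fluence = [0.5, 1.8, 2.4, 1.2, 0.3]
avg = mean_energy(energies, fluence)
```

The same weighted mean applies whether the spectrum comes from a semi-empirical model or a measurement; only the fluence array changes.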