Institutional Patient-specific IMRT QA Does Not Predict Unacceptable Plan Delivery
2014
DOI: 10.1016/j.ijrobp.2014.08.334

Abstract: Purpose: To determine whether in-house patient-specific IMRT QA results predict the Imaging and Radiation Oncology Core (IROC)-Houston phantom results. Methods and Materials: IROC Houston’s IMRT head and neck phantoms have been irradiated by numerous institutions as part of clinical trial credentialing. We retrospectively compared these phantom results with those of in-house IMRT QA (following the institution’s clinical process) for 855 irradiations performed between 2003 and 2013. The sensitivity and specific…
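The study's central comparison reduces to a 2×2 contingency: treat the IROC phantom outcome as ground truth and the in-house QA outcome as the screening test, with a phantom failure as the "unacceptable" condition to be detected. A minimal sketch of how sensitivity and specificity would be computed under that framing (the function and the toy data are illustrative, not the study's actual code):

```python
def sensitivity_specificity(in_house_pass, phantom_pass):
    """Treat the IROC phantom result as ground truth and in-house QA
    as the test. A phantom failure is the condition to detect, so a
    'positive' test is an in-house QA failure."""
    tp = fp = tn = fn = 0
    for qa_pass, ph_pass in zip(in_house_pass, phantom_pass):
        if not ph_pass and not qa_pass:
            tp += 1  # truly unacceptable, flagged by in-house QA
        elif not ph_pass and qa_pass:
            fn += 1  # unacceptable plan missed by in-house QA
        elif ph_pass and not qa_pass:
            fp += 1  # acceptable plan flagged anyway
        else:
            tn += 1
    sensitivity = tp / (tp + fn) if (tp + fn) else float("nan")
    specificity = tn / (tn + fp) if (tn + fp) else float("nan")
    return sensitivity, specificity
```

With this convention, a low sensitivity means in-house QA misses plans that fail the phantom, which is the pattern the paper's title reports.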

Cited by 128 publications (137 citation statements)
References 17 publications
“…13 This finding is a strong indication that standardized pass-fail criteria could be useful in standardizing quality of care across clinics. Nelms et al concur: "Many forms of relevant systematic errors can go undetected when the currently prevalent metrics for IMRT/[volumetric arc therapy] commissioning are used.…”
Section: Discussion
confidence: 98%
“…Some studies showed that the results of planar QA could not predict clinically relevant patient dose errors 21, 22, 23, 24. To overcome the shortcomings of planar QA, several new patient-specific QA methods have been developed 25, 26, 27.…”
Section: Discussion
confidence: 99%
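The "planar QA" these studies question typically means a gamma comparison of a measured planar dose against the planned one, summarized as a pass rate. As a rough illustration of the metric, here is a simplified 1D global gamma; real systems use 2D/3D distributions, interpolation, and low-dose thresholds, and `gamma_pass_rate` is a hypothetical helper, not any vendor's implementation:

```python
import numpy as np

def gamma_pass_rate(ref, eval_, spacing_mm, dd_pct=3.0, dta_mm=3.0):
    """Simplified 1D global gamma analysis.

    For each reference point, search the evaluated profile for the
    minimum combined dose-difference / distance-to-agreement metric;
    a point passes when that minimum is <= 1. Returns % passing."""
    norm = ref.max()  # global normalization dose
    x = np.arange(len(ref)) * spacing_mm
    gammas = np.empty(len(ref))
    for i in range(len(ref)):
        dose_term = (eval_ - ref[i]) / (dd_pct / 100.0 * norm)
        dist_term = (x - x[i]) / dta_mm
        gammas[i] = np.sqrt(dose_term**2 + dist_term**2).min()
    return 100.0 * np.mean(gammas <= 1.0)
```

A high pass rate at 3%/3 mm can coexist with clinically relevant dose errors, which is precisely the predictive weakness the cited studies report.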
“…The discrepancy between the vendor-provided model and the one determined in-house most likely has a diffuse explanation, partly associated with our choice of QA equipment and partly with the inherent limitations of dose-distribution comparison as a method for tuning a head model (12). The spatial resolution of the ArcCHECK makes small structures in the dose distributions difficult to resolve, and because the device is physically denser than water, its absolute calibration requires careful consideration in Monaco with regard to dose-to-medium, dose-to-water, and the effective relative electron density assigned to the image set 11, 13, 14, 15.…”
Section: Discussion
confidence: 99%
“…It is likely that the vendor-assisted procedure could have provided a parameter set that produced calculated point doses closer to those found in ion-chamber measurements had we initially chosen a water-equivalent QA system. However, given the lack of ion-chamber measurements in the ExpressQA package, reliance on the absolute calibration of such a QA system remains a concern, and the use of planar comparison metrics with a single pseudoclinical field in the ExpressQA package limits the generalizability of any statements of head-model accuracy based on this methodology 12, 16.…”
Section: Discussion
confidence: 99%