Background: Non-small-cell lung cancer (NSCLC) patients often demonstrate varying clinical courses and outcomes, even within the same tumor stage. This study explores deep learning applications in medical imaging, which allow for the automated quantification of radiographic characteristics and may improve patient stratification.

Methods and findings: We performed an integrative analysis on 7 independent datasets across 5 institutions totaling 1,194 NSCLC patients (median age = 68.3 years [range 32.5–93.3], median survival = 1.7 years [range 0.0–11.7]). Using external validation in computed tomography (CT) data, we identified prognostic signatures with a 3D convolutional neural network (CNN) for patients treated with radiotherapy (n = 771, median age = 68.0 years [range 32.5–93.3], median survival = 1.3 years [range 0.0–11.7]). We then employed a transfer learning approach to achieve the same for surgery patients (n = 391, median age = 69.1 years [range 37.2–88.0], median survival = 3.1 years [range 0.0–8.8]). We found that the CNN predictions were significantly associated with 2-year overall survival from the start of the respective treatment for radiotherapy (area under the receiver operating characteristic curve [AUC] = 0.70 [95% CI 0.63–0.78], p < 0.001) and surgery (AUC = 0.71 [95% CI 0.60–0.82], p < 0.001) patients. The CNN also significantly stratified patients into low and high mortality risk groups in both the radiotherapy (p < 0.001) and surgery (p = 0.03) datasets. Additionally, the CNN significantly outperformed random forest models built on clinical parameters (including age, sex, and tumor node metastasis stage) and demonstrated high robustness against test–retest (intraclass correlation coefficient = 0.91) and inter-reader (Spearman's rank-order correlation = 0.88) variations. To better understand the characteristics captured by the CNN, we identified the regions contributing most to its predictions and highlighted the importance of tumor-surrounding tissue in patient stratification. We also present preliminary findings linking the biological basis of the captured phenotypes to cell cycle and transcriptional processes. Limitations include the retrospective nature of this study as well as the opaque, black-box nature of deep learning networks.

Conclusions: Our results provide evidence that deep learning networks may be used for mortality risk stratification based on standard-of-care CT images from NSCLC patients. This evidence motivates future research into better deciphering the clinical and biological basis of deep learning networks, as well as validation in prospective data.
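To make the general architecture described above concrete, the sketch below shows how a small 3D CNN could map a fixed-size CT sub-volume around the tumor to a 2-year overall-survival probability. This is a minimal illustration in PyTorch under assumed layer sizes and input dimensions; it is not the network published in the study, and names such as Survival3DCNN and the 50-voxel patch size are hypothetical.

```python
# Minimal sketch (not the published implementation): a small 3D CNN that maps a
# fixed-size CT sub-volume around the tumor to a 2-year overall-survival logit.
# Channel counts, depths, and the 50x50x50 input size are illustrative assumptions.
import torch
import torch.nn as nn

class Survival3DCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool3d(2),                      # 50x50x50 -> 25x25x25
            nn.Conv3d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool3d(2),                      # 25 -> 12
            nn.Conv3d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool3d(1),              # global average pooling
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64, 1),                     # logit for 2-year survival
        )

    def forward(self, x):                         # x: (batch, 1, D, H, W)
        return self.classifier(self.features(x))

model = Survival3DCNN()
ct_patch = torch.randn(2, 1, 50, 50, 50)          # two dummy CT sub-volumes
probs = torch.sigmoid(model(ct_patch))            # predicted 2-year survival probability
```

In a transfer learning setup of the kind the abstract mentions, the convolutional features would typically be pretrained on the larger radiotherapy cohort and then fine-tuned (or reused with a new classifier head) on the smaller surgery cohort.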
Purpose: Tumors are continuously evolving biological systems, and medical imaging is uniquely positioned to monitor changes throughout treatment. Although qualitatively tracking lesions over space and time may be trivial, developing clinically relevant, automated radiomics methods that incorporate serial imaging data is far more challenging. In this study, we evaluated deep learning networks for predicting clinical outcomes by analyzing time series CT images of patients with locally advanced non-small cell lung cancer (NSCLC).

Experimental Design: Dataset A consists of 179 patients with stage III NSCLC treated with definitive chemoradiation, with pretreatment and posttreatment CT images at 1, 3, and 6 months of follow-up (581 scans). Models were developed using transfer learning of convolutional neural networks (CNN) combined with recurrent neural networks (RNN), using single seed-point tumor localization. Pathologic response validation was performed on dataset B, comprising 89 patients with NSCLC treated with chemoradiation and surgery (178 scans).

Results: Deep learning models using time series scans were significantly predictive of survival and cancer-specific outcomes (progression, distant metastases, and local-regional recurrence). Model performance improved with each additional follow-up scan added to the CNN model (e.g., 2-year overall survival: AUC = 0.74, P < 0.05). The models stratified patients into low and high mortality risk groups, which were significantly associated with overall survival [HR = 6.16; 95% confidence interval (CI), 2.17–17.44; P < 0.001]. The model also significantly predicted pathologic response in dataset B (P = 0.016).

Conclusions: We demonstrate that deep learning can integrate imaging scans at multiple timepoints to improve clinical outcome predictions. AI-based noninvasive radiomics biomarkers can have a significant impact in the clinic given their low cost and minimal requirements for human input.
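To illustrate the CNN-plus-RNN design described above, the following sketch applies a shared CNN encoder to each scan in a series and integrates the per-timepoint feature vectors with a GRU to produce a single outcome logit. This is a minimal PyTorch sketch under assumed feature dimensions and input sizes, not the published model; CNNRNNOutcome and all layer choices are illustrative.

```python
# Minimal sketch (assumptions, not the published model): a shared 3D CNN encoder
# is applied to every scan in the series, and a GRU integrates the per-timepoint
# feature vectors into one outcome logit (e.g., 2-year overall survival).
import torch
import torch.nn as nn

class CNNRNNOutcome(nn.Module):
    def __init__(self, feat_dim=64, hidden_dim=32):
        super().__init__()
        self.encoder = nn.Sequential(             # weights shared across timepoints
            nn.Conv3d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool3d(2),
            nn.Conv3d(16, feat_dim, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool3d(1), nn.Flatten(),
        )
        self.rnn = nn.GRU(feat_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, 1)       # outcome logit

    def forward(self, scans):                      # scans: (batch, T, 1, D, H, W)
        b, t = scans.shape[:2]
        feats = self.encoder(scans.flatten(0, 1))  # (b*t, feat_dim)
        feats = feats.view(b, t, -1)
        _, h = self.rnn(feats)                     # final hidden state: (1, b, hidden_dim)
        return self.head(h[-1])

model = CNNRNNOutcome()
series = torch.randn(2, 3, 1, 32, 32, 32)          # pretreatment scan plus two follow-ups
logit = model(series)
```

Sharing the encoder across timepoints keeps the parameter count fixed regardless of how many follow-up scans are available, which is one plausible way to realize the abstract's observation that performance improves as additional scans are added.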