2018
DOI: 10.3174/ajnr.a5667
Deep-Learning Convolutional Neural Networks Accurately Classify Genetic Mutations in Gliomas

Abstract: BACKGROUND AND PURPOSE: The World Health Organization has recently placed new emphasis on the integration of genetic information for gliomas. While tissue sampling remains the criterion standard, noninvasive imaging techniques may provide complementary insight into clinically relevant genetic mutations. Our aim was to train a convolutional neural network to independently predict underlying molecular genetic mutation status in gliomas with high accuracy and identify the most predictive imaging features for each …

Cited by 390 publications (331 citation statements)
References 45 publications
“…The 95% confidence intervals were calculated as exact Clopper‐Pearson confidence intervals. The AUC value as a criterion of discrimination accuracy was classified as low (0.5–0.7), moderate (0.7–0.9), or high (>0.9). The AUC of the individual models was compared with the DeLong method.…”
Section: Methods (mentioning)
confidence: 99%
“…The AUC value as a criterion of discrimination accuracy was classified as low (0.5–0.7), moderate (0.7–0.9), or high (>0.9). 15 The AUC of the individual models was compared with the DeLong method. 19 For a thorough evaluation of the accuracy of our test, and to account for the strong imbalance between benign and atypical/anaplastic meningiomas in our database, the Matthews Correlation Coefficient was calculated using the Multi Class Confusion Matrix function embedded in MATLAB.…”
Section: Statistics and Data Analysis (mentioning)
confidence: 99%
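The Matthews Correlation Coefficient used in that excerpt to handle class imbalance reduces, in the binary case, to a single formula over the confusion matrix. A sketch (not the MATLAB function the cited authors used):

```python
from math import sqrt

def mcc(tp, fp, fn, tn):
    """Matthews correlation coefficient from a binary confusion matrix.
    Returns 0.0 when any marginal is empty (the conventional choice)."""
    denom = sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return 0.0 if denom == 0 else (tp * tn - fp * fn) / denom
```

Unlike raw accuracy, MCC stays near zero for a classifier that simply predicts the majority class, which is why it suits the strongly imbalanced benign-versus-atypical/anaplastic setting described above.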
“…Delfanti et al demonstrated that genomic information with fluid-attenuated inversion recovery (FLAIR) MRI could be used for the classification of patient images into IDH wild type, and IDH mutation with and without 1p/19q co-deletion (11,13). In another approach, Chang et al (14) similarly demonstrated that IDH mutation status can be determined using T2-weighted (T2w), T2w fluid-attenuated inversion recovery (FLAIR), and T1-weighted pre- and post-contrast images. Preprocessing steps in their work included coregistration of all sequences, intensity normalization using zero mean and unit variance, application of a 3D convolutional neural network (CNN)-based whole-tumor segmentation tool for segmenting the lesion margins, cropping the output tumor mask on all input imaging sequences, and resizing individual image slices to 32 x 32 with 4 input sequence channels.…”
Section: Introduction (mentioning)
confidence: 99%
“…Preprocessing steps in their work included coregistration of all sequences, intensity normalization using zero mean and unit variance, application of a 3D convolutional neural network (CNN)-based whole-tumor segmentation tool for segmenting the lesion margins, cropping the output tumor mask on all input imaging sequences, and resizing individual image slices to 32 x 32 with 4 input sequence channels. The mean accuracy of the model was 94%, with 5-fold cross-validation accuracy ranging from 90% to 96% (14). Common to all of these previous methods is the involvement of preprocessing steps, typically including some form of brain tumor pre-segmentation or region-of-interest extraction, and the use of multiparametric or 3D near-isotropic MRI data that is often not part of the standard clinical imaging protocol (12,14).…”
Section: Introduction (mentioning)
confidence: 99%
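The preprocessing chain described in that excerpt (per-sequence zero-mean/unit-variance normalization, then resizing each slice to 32 x 32 and stacking 4 sequence channels) can be approximated in a few lines. Nearest-neighbour resizing here is a stand-in for whatever interpolation Chang et al actually used, and the function names are illustrative:

```python
import numpy as np

def normalize(img):
    """Zero-mean, unit-variance intensity normalization for one sequence."""
    return (img - img.mean()) / (img.std() + 1e-8)

def resize_nearest(img, size=(32, 32)):
    """Nearest-neighbour resize of a 2-D slice via integer index sampling."""
    h, w = img.shape
    rows = np.arange(size[0]) * h // size[0]
    cols = np.arange(size[1]) * w // size[1]
    return img[rows][:, cols]

def preprocess(sequences, size=(32, 32)):
    """Stack co-registered MR sequences into a (32, 32, n_sequences) tensor,
    normalizing each sequence independently before resizing."""
    return np.stack([resize_nearest(normalize(s), size) for s in sequences], axis=-1)
```

Feeding four 128 x 128 slices (e.g. T2w, FLAIR, T1 pre- and post-contrast) through `preprocess` yields a (32, 32, 4) array of the kind such a CNN would take as input; the tumor-segmentation and mask-cropping steps of the cited pipeline are omitted here.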