2021
DOI: 10.1016/j.ejmp.2021.03.009
Artificial intelligence: Deep learning in oncological radiomics and challenges of interpretability and data harmonization

Abstract: Over the last decade there has been an extensive evolution in the Artificial Intelligence (AI) field. Modern radiation oncology is based on the exploitation of advanced computational methods aiming at personalization and high diagnostic and therapeutic precision. The quantity of available imaging data and the rapid development of Machine Learning (ML), particularly Deep Learning (DL), triggered research on uncovering "hidden" biomarkers and quantitative features from anatomical and functional medi…

Cited by 124 publications (85 citation statements)
References 142 publications
“…Radiomic flow is a complex process, and every aspect of image acquisition, such as defining and contouring the regions of interest, choosing the best features to extract, and applying the proper statistics, remains challenging. Lately, explainable AI (XAI) using deep neural networks (DNNs) may help radiomics in classification and prediction in the clinical setting [174, 175], and a controllable and explainable probabilistic radiomics framework was proposed, in which a 3D CNN feature is extracted from the lesion region only and used to approximate the ambiguity distribution over human experts [176]. These new features will have to be further validated.…”
Section: Discussion
confidence: 99%
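The quoted passage outlines the radiomic workflow: contour a region of interest, extract quantitative features, and apply statistics. As a rough illustration only, not the cited framework, a minimal first-order feature extractor over a masked lesion region might look like the following sketch (the `first_order_features` helper and the 32-bin histogram are assumptions, not part of the original text):

```python
import numpy as np

def first_order_features(image: np.ndarray, mask: np.ndarray) -> dict:
    """First-order radiomic statistics over a contoured region of interest.

    `image` is a 2D/3D intensity array and `mask` a boolean array of the
    same shape marking the lesion; only voxels inside the mask contribute.
    """
    roi = image[mask.astype(bool)]
    counts, _ = np.histogram(roi, bins=32)
    p = counts[counts > 0] / roi.size  # discrete intensity distribution
    return {
        "mean": float(roi.mean()),
        "std": float(roi.std()),
        "min": float(roi.min()),
        "max": float(roi.max()),
        "entropy": float(-(p * np.log2(p)).sum()),  # Shannon entropy, bits
    }
```

Real pipelines (e.g. PyRadiomics-style toolkits) add image resampling, discretization settings, and dozens of texture features; exactly those acquisition and preprocessing choices are what the quoted authors flag as hard to harmonize.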
“…Despite the encouraging results that we obtained, the ML-based decision system we have developed is not yet ready to be applied in clinical workflows. Before application-specific ML algorithms can be successfully translated into clinics, a number of challenges must indeed be overcome, including the harmonization of different data samples [7,59], the reliability and reproducibility of the results [7,10,60], the interpretability of the models and the results [59,61], and compliance with current regulations [62].…”
Section: Table
confidence: 99%
“…One key challenge of applying deep networks in clinical decision making is that deep networks are black box models with multilayer nonlinear operations; thus the reasoning behind the results from deep networks is very difficult to interpret clinically. Explainable AI is an emerging field of active research aimed at addressing this challenge [90, 91].…”
Section: Machine Learning and Radiomics Workflow For Oncology Imaging
confidence: 99%
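A basic, model-agnostic flavor of the explanation techniques this passage alludes to is input saliency: estimating how strongly each input voxel or feature influences the model's output. A minimal sketch under stated assumptions (the `saliency_map` helper is hypothetical, and finite differences stand in for the backpropagated gradients a real DL framework would provide):

```python
import numpy as np

def saliency_map(score_fn, x: np.ndarray, eps: float = 1e-4) -> np.ndarray:
    """Approximate input saliency |d score / d x| via central differences.

    `score_fn` maps an input array to a scalar model output; larger
    gradient magnitudes indicate inputs that most influence the prediction,
    giving a crude but model-agnostic explanation of a black-box score.
    """
    grad = np.zeros_like(x, dtype=float)
    for i in range(x.size):
        up, down = x.copy(), x.copy()
        up.flat[i] += eps
        down.flat[i] -= eps
        grad.flat[i] = (score_fn(up) - score_fn(down)) / (2 * eps)
    return np.abs(grad)
```

For a linear scorer the recovered saliency equals the absolute weights; for a real DNN one would use framework autodiff (or methods such as Grad-CAM) rather than perturbing every voxel, which is far too slow for 3D images.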