2022
DOI: 10.1016/j.inffus.2021.07.016
Unbox the black-box for the medical explainable AI via multi-modal and multi-centre data fusion: A mini-review, two showcases and beyond

Cited by 433 publications (203 citation statements)
References 138 publications
“…We propose a segmentation model that can automatically segment the CT images and MRI images regardless of the thickness. In our previous research, the feasibility of this model was verified [21, 29].…”
Section: Methods (mentioning)
confidence: 99%
“…According to [21,22], post hoc explainability methods can be categorized as: dimension reduction, attention mechanism, restricted neural network architecture, text explanation, visual explanation, local explanation, explanation by example, explanation by simplification and feature relevance. The taxonomy of XAI methods is shown in Figure 2.…”
Section: Related Explainable Artificial Intelligence Concepts (mentioning)
confidence: 99%
“…Other post hoc methods include SHapley Additive exPlanations (SHAP) [17], class activation mapping (CAM) [18], principal component analysis (PCA) [19] and Gradient-weighted class activation mapping (Grad-CAM) [20]. According to [21, 22], post hoc explainability methods can be categorized as: dimension reduction, attention mechanism, restricted neural network architecture, text explanation, visual explanation, local explanation, explanation by example, explanation by simplification and feature relevance. The taxonomy of XAI methods is shown in Figure 2.…”
Section: Introduction (mentioning)
confidence: 99%
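
The post hoc methods named above (SHAP, CAM, PCA, Grad-CAM) all explain an already trained model without modifying it. Below is a minimal sketch of one of them, Grad-CAM; the resnet18 backbone, the choice of layer4 as the target layer, and the random input tensor standing in for a medical image are illustrative assumptions, not details taken from the reviewed paper or the citing studies.

import torch
import torch.nn.functional as F
from torchvision.models import resnet18

# Assumed backbone and target layer; any CNN with a final convolutional block works the same way.
model = resnet18(weights=None).eval()
target_layer = model.layer4

activations, gradients = {}, {}

def save_activation(module, inputs, output):
    activations["value"] = output.detach()

def save_gradient(module, grad_input, grad_output):
    gradients["value"] = grad_output[0].detach()

target_layer.register_forward_hook(save_activation)
target_layer.register_full_backward_hook(save_gradient)

x = torch.randn(1, 3, 224, 224)           # stand-in for a preprocessed image slice
scores = model(x)
scores[0, scores.argmax()].backward()     # gradient of the top-scoring class

# Grad-CAM: weight each feature map by its spatially averaged gradient, sum, then ReLU.
weights = gradients["value"].mean(dim=(2, 3), keepdim=True)
cam = F.relu((weights * activations["value"]).sum(dim=1, keepdim=True))
cam = F.interpolate(cam, size=x.shape[2:], mode="bilinear", align_corners=False)
cam = (cam - cam.min()) / (cam.max() - cam.min() + 1e-8)   # normalize to [0, 1]
print(cam.shape)                           # (1, 1, 224, 224) heat map over the input

The resulting heat map highlights the input regions that most increase the predicted class score, i.e. the "visual explanation" category in the taxonomy quoted above.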
“…Recently, plenty of COVID-19 recognition algorithms have been proposed, which have adopted artificial intelligence algorithms especially using the CNN. A comprehensive review of artificial intelligence assisted COVID-19 detection and diagnosis can be found elsewhere [22], [23], [24], [25], [26], and here we only provided a summary for the most relevant studies.…”
Section: Related Work (mentioning)
confidence: 99%