2023
DOI: 10.1007/s41060-023-00401-z

Improving trust and confidence in medical skin lesion diagnosis through explainable deep learning

Abstract: A key issue in critical contexts such as medical diagnosis is the interpretability of the deep learning models adopted in decision-making systems. Research in eXplainable Artificial Intelligence (XAI) is trying to solve this issue. However, XAI approaches are often tested only on generalist classifiers and do not represent realistic problems such as those of medical diagnosis. In this paper, we aim at improving the trust and confidence of users towards automatic AI decision systems in the field of medical skin …


Cited by 7 publications (4 citation statements)
References: 33 publications
“…The rapid ascent in ABELE's insertion curve signifies that its saliency map more accurately identifies the image segments most critical to the classifier's decision-making process. ABELE's insertion curve grows much earlier than those of LIME and LORE (from [17]).…”
Section: Explaining via Saliency Map
confidence: 99%
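
The insertion-curve comparison quoted above can be illustrated with a short sketch: pixels are revealed on top of a blank baseline in order of decreasing saliency, and the classifier's confidence in the target class is recorded at each step, so a curve that rises earlier indicates a more faithful saliency map. The sketch below assumes a PyTorch classifier; the names model, image, saliency and target are illustrative placeholders, not identifiers from the cited works.

import numpy as np
import torch

def insertion_curve(model, image, saliency, target, steps=50):
    """Return the classifier's confidence in `target` as the most salient
    pixels of `image` are progressively inserted into a blank baseline.

    model    : torch.nn.Module mapping (1, C, H, W) -> (1, num_classes) logits
    image    : torch.Tensor of shape (C, H, W)
    saliency : np.ndarray of shape (H, W), higher values = more important
    target   : int, index of the class whose probability is tracked
    """
    model.eval()
    c, h, w = image.shape
    order = np.argsort(-saliency.ravel())             # most salient pixels first
    canvas = torch.zeros_like(image)                  # blank baseline image
    flat_canvas, flat_image = canvas.view(c, -1), image.reshape(c, -1)
    per_step = int(np.ceil(order.size / steps))
    confidences = []

    with torch.no_grad():
        for step in range(steps + 1):
            probs = torch.softmax(model(canvas.unsqueeze(0)), dim=1)
            confidences.append(probs[0, target].item())
            # Reveal the next block of most-salient pixels from the real image.
            idx = torch.as_tensor(order[step * per_step:(step + 1) * per_step])
            flat_canvas[:, idx] = flat_image[:, idx]
    return confidences  # the area under this curve summarizes map faithfulness
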
“…This paper aims to build upon and refine the methodologies discussed in [3, 15-17], exploring the application of an explanation method in a genuine medical context, specifically for diagnosing skin lesions from images. Utilizing the labeled dataset from the ISIC 2019 (International Skin Imaging Collaboration) challenge (https://challenge.isic-archive.…”
Section: Introduction
confidence: 99%
“…However, the effectiveness of these models in clinical settings hinges on their interpretability and the transparency of their decision-making, ensuring healthcare professionals can integrate AI insights confidently into patient care. In [42-45], a case study on skin lesion diagnosis using a ResNet classifier trained on the ISIC (International Skin Imaging Collaboration) dataset is presented. The classifier's decisions are explained using ABELE.…”
Section: The International Skin Imaging Collaboration
confidence: 99%
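
As a rough illustration of the setup described in this statement, the sketch below builds a torchvision ResNet with its final layer resized for skin-lesion classes and applies it to a single dermoscopic image. The class count, pretrained weights, file name and preprocessing are assumptions made for illustration, not details taken from the cited study; an explainer such as ABELE would operate on the resulting class probabilities.

import torch
from torchvision import models, transforms
from PIL import Image

NUM_CLASSES = 8  # assumption: the eight diagnostic categories of the ISIC 2019 challenge

# ImageNet-pretrained ResNet-50 with a new classification head for skin lesions.
model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2)
model.fc = torch.nn.Linear(model.fc.in_features, NUM_CLASSES)
model.eval()

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# Hypothetical input file; in practice this would be a dermoscopic image from ISIC.
image = preprocess(Image.open("lesion.jpg").convert("RGB")).unsqueeze(0)
with torch.no_grad():
    probs = torch.softmax(model(image), dim=1)
print(probs)  # per-class probabilities that an explainer such as ABELE would act on
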