2022 30th European Signal Processing Conference (EUSIPCO)
DOI: 10.23919/eusipco55093.2022.9909776
On Interpretability of CNNs for Multimodal Medical Image Segmentation

Abstract: Despite their huge potential, deep learning-based models are still not trusted enough to warrant their adoption in clinical practice. Research on the interpretability and explainability of deep learning is currently attracting considerable attention. The Multilayer Convolutional Sparse Coding (ML-CSC) data model provides a model-based explanation of convolutional neural networks (CNNs). In this article, we extend the ML-CSC framework towards multimodal data for medical image segmentation, and propose a merged joint …
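As background for the abstract, the sketch below illustrates how the ML-CSC model is commonly related to a CNN forward pass: each layer's sparse code is obtained by a dictionary correlation followed by soft-thresholding, which mirrors convolution plus a ReLU-like nonlinearity. This is a minimal, illustrative sketch under the standard ML-CSC formulation, not the paper's multimodal extension; the dictionary sizes, thresholds, and function names are assumptions chosen for demonstration.

```python
# Minimal sketch of the ML-CSC model with layered soft-thresholding pursuit.
# Assumptions: random column-normalised dictionaries and arbitrary thresholds;
# these stand in for learned convolutional dictionaries.
import numpy as np

def soft_threshold(z, beta):
    """Elementwise soft-thresholding, the proximal operator of the L1 norm."""
    return np.sign(z) * np.maximum(np.abs(z) - beta, 0.0)

def layered_pursuit(x, dictionaries, thresholds):
    """Estimate sparse codes gamma_1, ..., gamma_K for a signal x under the
    ML-CSC model x = D1 @ gamma_1, gamma_1 = D2 @ gamma_2, and so on.
    Each step (D_i.T @ code, then thresholding) mirrors a convolutional
    layer followed by a ReLU-like nonlinearity in a CNN forward pass."""
    codes, code = [], x
    for D, beta in zip(dictionaries, thresholds):
        code = soft_threshold(D.T @ code, beta)
        codes.append(code)
    return codes

# Toy two-layer example with random dictionaries (hypothetical dimensions).
rng = np.random.default_rng(0)
D1 = rng.standard_normal((64, 128)); D1 /= np.linalg.norm(D1, axis=0)
D2 = rng.standard_normal((128, 256)); D2 /= np.linalg.norm(D2, axis=0)
x = rng.standard_normal(64)
gamma1, gamma2 = layered_pursuit(x, [D1, D2], thresholds=[0.5, 0.5])
print(gamma1.shape, gamma2.shape)  # (128,) (256,)
```

In the multimodal setting described in the abstract, one would feed codes derived from several imaging modalities into such a layered model; the details of the merged joint representation are elided in the truncated abstract above.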

Cited by 1 publication (2 citation statements)
References 17 publications
“…Despite the effectiveness of deep learning techniques, medical image segmentation frequently faces issues if there is an insufficient number of labelled data. To efficiently use the data that is currently accessible and to draw knowledge from different fields, researchers have proposed transfer learning approaches and data augmentation methodologies [5], [9]. Another issue includes the necessity for interpretable models and the generalisation of models across various datasets.…”
Section: Related Work
confidence: 99%
“…Another issue includes the necessity for interpretable models and the generalisation of models across various datasets. To improve medical image segmentation for diverse clinical applications, additional research is required to solve these problems and investigate novel architectures [9], [10] for better segmentation results.…”
Section: Related Work
confidence: 99%