2022
DOI: 10.1007/s00521-022-08100-9
Black-box error diagnosis in Deep Neural Networks for computer vision: a survey of tools

Cited by 7 publications (3 citation statements)
References 73 publications
“…However, the complex and opaque nature of these architectures often categorizes them as "black boxes", making it challenging to understand their internal workings and decision-making processes. Fraternali et al (2022) highlight the necessity of methods that go beyond standard performance metrics to diagnose model behavior and prediction errors, thereby enhancing our understanding of DNNs. The black-box nature of Artificial Neural Networks (ANNs) raises ethical and forensic concerns, especially in mission-critical applications affecting human life, such as healthcare and finance.…”
Section: Neural Network: Unveiling the Black Box (mentioning, confidence: 99%)
“…However, the complex and opaque nature of these architectures often categorizes them as "black boxes", making it challenging to understand their internal workings and decision-making processes. Fraternali et al (2022) highlight the necessity of methods that go beyond standard performance metrics to diagnose model behavior and prediction errors, thereby enhancing our understanding of DNNs.…”
Section: Neural Network: Unveiling the Black Box (mentioning, confidence: 99%)
“…Current DL models are plagued by information leakage and only provide limited measures of generalizability, i.e., the model's performance in real-world populations. The inherent 'black box' nature of neural networks impedes the interpretation of model functions and the subsequent feedback of knowledge for clinicians [91].…”
(mentioning, confidence: 99%)