2023
DOI: 10.1097/icu.0000000000000983

Explainable artificial intelligence in ophthalmology

Ting Fang Tan,
Peilun Dai,
Xiaoman Zhang
et al.

Abstract: Purpose of review: Despite the growing scope of artificial intelligence (AI) and deep learning (DL) applications in the field of ophthalmology, most have yet to reach clinical adoption. Beyond model performance metrics, there has been an increasing emphasis on the need for explainability of proposed DL models. Recent findings: Several explainable AI (XAI) methods have been proposed, and increasingly applied in ophthalmological DL applications, predominant…
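The abstract is truncated before it names specific techniques, but saliency-based attribution methods such as Grad-CAM are among the XAI approaches most often applied to ophthalmic DL models. The sketch below is purely illustrative and is not taken from the review: it assumes a PyTorch CNN (a stock, randomly initialised ResNet-18 stands in for a model trained on fundus photographs or OCT scans) and computes a Grad-CAM heatmap for a single input.

    import torch
    import torch.nn.functional as F
    from torchvision import models

    # Illustrative stand-in: in practice this would be a CNN trained on
    # fundus photographs or OCT scans; resnet18() here is randomly initialised.
    model = models.resnet18()
    model.eval()
    target_layer = model.layer4[-1]  # last convolutional block

    activations, gradients = {}, {}

    def save_activation(module, inputs, output):
        activations["value"] = output.detach()

    def save_gradient(module, grad_input, grad_output):
        gradients["value"] = grad_output[0].detach()

    target_layer.register_forward_hook(save_activation)
    target_layer.register_full_backward_hook(save_gradient)

    def grad_cam(image, class_idx=None):
        # image: (1, 3, H, W) preprocessed tensor; returns an (H, W) map in [0, 1].
        logits = model(image)
        if class_idx is None:
            class_idx = int(logits.argmax(dim=1))
        model.zero_grad()
        logits[0, class_idx].backward()
        acts = activations["value"][0]            # (C, h, w) feature maps
        grads = gradients["value"][0]             # (C, h, w) gradients w.r.t. them
        weights = grads.mean(dim=(1, 2))          # global-average-pooled gradients
        cam = F.relu((weights[:, None, None] * acts).sum(dim=0))
        cam = F.interpolate(cam[None, None], size=image.shape[-2:],
                            mode="bilinear", align_corners=False)[0, 0]
        return (cam - cam.min()) / (cam.max() - cam.min() + 1e-8)

    # A random tensor stands in for a preprocessed retinal image.
    heatmap = grad_cam(torch.randn(1, 3, 224, 224))
    print(heatmap.shape)  # torch.Size([224, 224])

The resulting heatmap is typically overlaid on the input image so clinicians can check whether the model attends to clinically relevant regions, which is the kind of explainability the review discusses.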

Cited by 6 publications (1 citation statement)
References 51 publications (77 reference statements)
“…This has implications in establishing the robustness of AI models with data from different sources, and also culpability in the case of incorrect predictions. As such, explainable AI (XAI) has understandably become an important consideration in AI model acceptance [ 82 ].…”
Section: Main Text (mentioning)
Confidence: 99%