2020
DOI: 10.1049/htl.2020.0033
Explainable artificial intelligence for heart rate variability in ECG signal

Cited by 24 publications (4 citation statements)
References 39 publications
“…For the end-users of the models, including physicians and clinical experts, understanding the internal workings and decision logic of the models is crucial [27]. This opacity can lead to a decrease in trust in the model's predictions by medical professionals, particularly in medical diagnostic situations where a detailed explanation of the model's behavior is required [7].…”
Section: Interpretability Analysis
confidence: 99%
“…In the research of ECG signal classification, deep learning models such as CNNs have achieved remarkable results [28,29], but the opacity of their decision-making processes can lead to a decrease in trust in clinical applications [7]. In the domain of ECG signal classification, SHAP value analysis has proven to be a powerful tool for identifying and interpreting key features that affect model predictions [7].…”
Section: Comparison With Existing Work
confidence: 99%
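The SHAP value analysis mentioned in the statement above is grounded in Shapley values from cooperative game theory: each feature's attribution is its average marginal contribution across all feature coalitions. A minimal sketch of that idea, using brute-force enumeration and a toy linear scorer over hypothetical HRV-style features (the model, weights, and baseline are illustrative assumptions, not taken from the cited work):

```python
from itertools import combinations
from math import factorial

def shapley_values(f, x, baseline):
    """Exact Shapley values for model f by enumerating feature coalitions.

    Features outside a coalition S are replaced by their baseline value,
    mirroring the reference-input convention used by SHAP explainers.
    """
    n = len(x)

    def value(S):
        z = [x[i] if i in S else baseline[i] for i in range(n)]
        return f(z)

    phis = []
    for i in range(n):
        others = [j for j in range(n) if j != i]
        phi = 0.0
        for k in range(n):
            for S in combinations(others, k):
                # Shapley coalition weight: |S|! (n - |S| - 1)! / n!
                w = factorial(k) * factorial(n - k - 1) / factorial(n)
                phi += w * (value(set(S) | {i}) - value(set(S)))
        phis.append(phi)
    return phis

# Toy linear "classifier score" over three hypothetical HRV features.
weights = [0.8, -0.5, 0.3]
model = lambda z: sum(w * v for w, v in zip(weights, z))

x = [1.0, 2.0, 3.0]          # instance to explain
baseline = [0.0, 0.0, 0.0]   # reference input
# For a linear model each value reduces to weight_i * (x_i - baseline_i).
print(shapley_values(model, x, baseline))
```

Brute-force enumeration is exponential in the number of features, which is why practical SHAP implementations rely on sampling or model-specific shortcuts; this sketch only illustrates the attribution principle the citing papers refer to.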