2020
DOI: 10.1038/s41467-020-17431-x
Explainable artificial intelligence model to predict acute critical illness from electronic health records

Abstract: Acute critical illness is often preceded by deterioration of routinely measured clinical parameters, e.g., blood pressure and heart rate. Early clinical prediction is typically based on manually calculated screening metrics that simply weigh these parameters, such as early warning scores (EWS). The predictive performance of EWSs yields a tradeoff between sensitivity and specificity that can lead to negative outcomes for the patient. Previous work on electronic health records (EHR) trained artificial intelligen…
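The EWS approach that the abstract contrasts with learned models is simple enough to sketch directly. The bands and point values below are purely illustrative assumptions (not the scoring system evaluated in the paper): each routinely measured parameter is binned into a points band, and the bands are summed into a single screening score.

```python
# Minimal sketch of how an early warning score (EWS) aggregates vital signs.
# The bands and point values below are illustrative only, not any published EWS.

def band_score(value, bands):
    """Return the points for the first band whose [low, high) range contains value."""
    for low, high, points in bands:
        if low <= value < high:
            return points
    return 0

# Hypothetical scoring bands: (inclusive low, exclusive high, points).
HEART_RATE_BANDS = [(0, 40, 3), (40, 50, 1), (50, 90, 0), (90, 110, 1), (110, 130, 2), (130, 999, 3)]
SYS_BP_BANDS = [(0, 90, 3), (90, 100, 2), (100, 110, 1), (110, 220, 0), (220, 999, 3)]
RESP_RATE_BANDS = [(0, 9, 3), (9, 12, 1), (12, 21, 0), (21, 25, 2), (25, 99, 3)]

def early_warning_score(heart_rate, systolic_bp, resp_rate):
    """Sum the per-parameter points into a single screening score."""
    return (band_score(heart_rate, HEART_RATE_BANDS)
            + band_score(systolic_bp, SYS_BP_BANDS)
            + band_score(resp_rate, RESP_RATE_BANDS))

print(early_warning_score(heart_rate=118, systolic_bp=95, resp_rate=23))  # -> 2 + 2 + 2 = 6
```

The fixed alert cutoff applied to such a score is what produces the sensitivity/specificity tradeoff the abstract mentions: moving the cutoff down catches more deteriorations but raises false alarms, and vice versa.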


Cited by 281 publications (149 citation statements)
References 45 publications
“…For example, the AI systems trained on EHRs exhibit substantial predictive performance in relation to real‐time early clinical prediction; however, their clinical translation is difficult because they provide less insight into systems' complex decisions. Recently, an xAI system provided an early prediction model to facilitate clinical translations, which can simultaneously use EHR data to explain model decisions [71]. On the other hand, in multicenter clinical trials, xAI could also support safe communication on different levels, including the study of basic mechanisms, clinical translation, privacy and the data sharing system, and the decisions made by collaborators and managers.…”
Section: Methods and Results (mentioning)
confidence: 99%
“…Other explainability algorithms are model-agnostic, meaning they can be applied to any type of model, regardless of its mathematical basis [ 51 ]. In the study by Lauritsen et al [ 52 ], an explainable EWS was developed based on a temporal convolutional network, using a separate module for explanations. These methodologies are promising, but their application to health care, including to EWS, has been limited.…”
Section: Discussion (mentioning)
confidence: 99%
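As a rough illustration of the kind of architecture described above (a temporal convolutional network producing a risk score from a vital-sign time series, with explanation handled by a separate module), here is a minimal PyTorch sketch. It is not the architecture from Lauritsen et al.; the layer sizes, block count, and input dimensions are arbitrary assumptions.

```python
import torch
import torch.nn as nn

class CausalConvBlock(nn.Module):
    """One dilated causal 1-D convolution block, the building block of a TCN."""
    def __init__(self, channels, dilation, kernel_size=3):
        super().__init__()
        self.pad = (kernel_size - 1) * dilation          # left-pad so outputs never see the future
        self.conv = nn.Conv1d(channels, channels, kernel_size, dilation=dilation)
        self.act = nn.ReLU()

    def forward(self, x):                                # x: (batch, channels, time)
        out = nn.functional.pad(x, (self.pad, 0))        # causal (left-only) padding
        return self.act(self.conv(out)) + x              # residual connection

class RiskTCN(nn.Module):
    """Toy TCN risk model over a multivariate vital-sign series; outputs one risk logit."""
    def __init__(self, n_features, channels=32, n_blocks=4):
        super().__init__()
        self.input_proj = nn.Conv1d(n_features, channels, kernel_size=1)
        self.blocks = nn.Sequential(*[CausalConvBlock(channels, dilation=2 ** i) for i in range(n_blocks)])
        self.head = nn.Linear(channels, 1)

    def forward(self, x):                                # x: (batch, time, n_features)
        h = self.blocks(self.input_proj(x.transpose(1, 2)))
        return self.head(h[:, :, -1])                    # risk logit at the most recent time step

model = RiskTCN(n_features=8)
vitals = torch.randn(2, 48, 8)                           # 2 patients, 48 hourly steps, 8 parameters
print(torch.sigmoid(model(vitals)).shape)                # torch.Size([2, 1])
```

The dilated causal convolutions give the model a receptive field spanning many hours of history while keeping each prediction dependent only on past measurements, which is what makes such a model usable for real-time screening.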
“…20(f) for one such explanation). In another work [106], the input time series is formed by EHRs sampled at a much lower rate, from which a DNN model can decide among a set of medical conditions. The medical condition can then be attributed to the different medical measurements.…”
Section: B. From Explanations to Novel Scientific Insights (mentioning)
confidence: 99%
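The attribution step described above (mapping a predicted medical condition back onto the individual medical measurements) can be sketched generically. The cited works use their own explanation modules; the example below instead uses plain input-times-gradient on a hypothetical toy model, purely to illustrate how one prediction is traced back to (time step, parameter) inputs.

```python
import torch
import torch.nn as nn

# Hypothetical toy risk model over a (time, features) series, used only to demonstrate attribution.
N_STEPS, N_FEATURES = 48, 8
toy_model = nn.Sequential(nn.Flatten(), nn.Linear(N_STEPS * N_FEATURES, 1))

def input_times_gradient(model, series):
    """Attribute the risk output to each (time step, parameter) input via input x gradient."""
    series = series.clone().requires_grad_(True)          # (1, time, n_features)
    model(series).squeeze().backward()                     # scalar risk logit -> gradients w.r.t. inputs
    return (series * series.grad).detach().squeeze(0)      # (time, n_features) relevance map

attribution = input_times_gradient(toy_model, torch.randn(1, N_STEPS, N_FEATURES))
step, feature = divmod(int(attribution.abs().argmax()), N_FEATURES)
print(f"largest attribution at time step {step}, parameter {feature}")
```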