2021
DOI: 10.1016/j.jbi.2021.103778
A deep attention model to forecast the Length Of Stay and the in-hospital mortality right on admission from ICD codes and demographic data

Cited by 28 publications (14 citation statements)
References 27 publications
“…For instance, Zebin & Chaussalet [12] used an autoencoder deep neural network to categorize short stays (0-7 days) and long stays (>7 days) using the Medical Information Mart for Intensive Care III (MIMIC-III) dataset [13], but the dataset lacked multiple modalities and the methodology did not analyze the cohort of risk factors responsible for extended LoS. Likewise, Harerimana et al. [14] proposed an attention-based DL method to predict LoS and in-hospital mortality, but this method is also limited by its use of restricted lab data and the unavailability of radiological information. Rajkomar et al. [15] proposed a three-tier approach combining three DL models to predict hospital readmission and patients' LoS, based on data belonging to patients with varying diseases, without identifying the cohort of risk factors affecting patient LoS.…”
Section: Introduction (mentioning, confidence: 99%)
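For readers who want a concrete picture of the autoencoder approach cited above, the following is a minimal sketch, not the cited authors' code: a dense autoencoder whose encoder is reused for binary classification of short (0-7 days) versus long (>7 days) stays. The input width, layer sizes, and synthetic data are illustrative assumptions.

    # Minimal sketch (assumed architecture, not Zebin & Chaussalet's published code):
    # a dense autoencoder whose encoder feeds a binary "long stay" classifier.
    import torch
    import torch.nn as nn

    class LOSAutoencoderClassifier(nn.Module):
        def __init__(self, n_features: int = 64, latent_dim: int = 16):
            super().__init__()
            self.encoder = nn.Sequential(
                nn.Linear(n_features, 32), nn.ReLU(),
                nn.Linear(32, latent_dim), nn.ReLU(),
            )
            self.decoder = nn.Sequential(
                nn.Linear(latent_dim, 32), nn.ReLU(),
                nn.Linear(32, n_features),
            )
            # Binary head: logit for P(LOS > 7 days)
            self.classifier = nn.Linear(latent_dim, 1)

        def forward(self, x):
            z = self.encoder(x)
            return self.decoder(z), self.classifier(z).squeeze(-1)

    model = LOSAutoencoderClassifier()
    x = torch.randn(8, 64)                    # 8 synthetic admissions (hypothetical features)
    y = torch.randint(0, 2, (8,)).float()     # 1 = long stay (>7 days)
    recon, logits = model(x)
    # Joint objective: reconstruction plus classification
    loss = nn.MSELoss()(recon, x) + nn.BCEWithLogitsLoss()(logits, y)
    loss.backward()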
“…Zebin & Chaussalet (2019) proposed to use autoencoder dense neural networks to classify LOS into short (0–7 days) and long (>7 days) stays using the public MIMIC-III dataset. Harerimana, Kim, and Jang (2021) proposed a hierarchical attention network to predict LOS and in-hospital mortality. The proposed model was able to leverage the patient anamnesis and the free-text diagnosis recorded on the first day for prediction purposes.…”
Section: Introduction (mentioning, confidence: 99%)
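The hierarchical attention network mentioned above is described only at a high level in the citation statement. The sketch below shows the general idea of attention pooling over embedded ICD codes, combined with demographic features and two prediction heads (LOS class and in-hospital mortality). It is an assumed, simplified illustration of such an architecture, not the published model; all dimensions and names are hypothetical.

    # Illustrative attention-pooling sketch (assumed, not the published architecture).
    import torch
    import torch.nn as nn

    class AttentionAdmissionModel(nn.Module):
        def __init__(self, n_codes=2000, emb_dim=64, n_demo=8, n_los_classes=4):
            super().__init__()
            self.code_emb = nn.Embedding(n_codes, emb_dim, padding_idx=0)
            self.attn = nn.Linear(emb_dim, 1)               # scalar attention score per code
            self.los_head = nn.Linear(emb_dim + n_demo, n_los_classes)
            self.mort_head = nn.Linear(emb_dim + n_demo, 1)

        def forward(self, codes, demo):
            e = self.code_emb(codes)                         # (batch, codes_per_stay, emb_dim)
            scores = self.attn(e).squeeze(-1)                # (batch, codes_per_stay)
            scores = scores.masked_fill(codes == 0, float('-inf'))  # ignore padding codes
            weights = torch.softmax(scores, dim=-1)
            visit = (weights.unsqueeze(-1) * e).sum(dim=1)   # attention-weighted code summary
            h = torch.cat([visit, demo], dim=-1)
            return self.los_head(h), self.mort_head(h).squeeze(-1)

    model = AttentionAdmissionModel()
    codes = torch.randint(1, 2000, (4, 10))   # 4 admissions, 10 ICD codes each (synthetic)
    demo = torch.randn(4, 8)                  # age, sex, admission type, ... (synthetic)
    los_logits, mort_logit = model(codes, demo)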
“…In other words, the aim is to group the LOS into multiple classes, e.g., short stay, medium stay and long stay, based on the number of days that the patient stays in the hospital. However, several studies have demonstrated that LOS distributions are highly skewed to the right (Harerimana et al., 2021; Ma et al., 2020). This skewness indicates that the dataset becomes heavily imbalanced, as only a few long-LOS cases exist.…”
Section: Introduction (mentioning, confidence: 99%)
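One common way to handle the right-skewed, imbalanced LOS distribution described above is to bin LOS and weight the loss by inverse class frequency. The sketch below uses synthetic LOS values and arbitrary bin edges (3, 7, 14 days) chosen for illustration; neither comes from the cited studies.

    # Sketch of inverse-frequency class weights for binned LOS (assumed bins and data).
    import numpy as np
    import torch
    import torch.nn as nn

    los_days = np.array([1, 2, 2, 3, 3, 4, 5, 6, 7, 9, 12, 30])  # synthetic, right-skewed
    bins = np.digitize(los_days, [3, 7, 14])                     # short / medium / long / very long
    counts = np.bincount(bins, minlength=4)
    weights = counts.sum() / np.maximum(counts, 1)               # inverse-frequency weights
    weights = weights / weights.sum()

    criterion = nn.CrossEntropyLoss(weight=torch.tensor(weights, dtype=torch.float32))
    logits = torch.randn(5, 4)                  # synthetic model outputs for 5 admissions
    targets = torch.tensor([0, 0, 1, 2, 3])     # binned LOS labels
    loss = criterion(logits, targets)           # rare long-stay classes are up-weighted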
“…Added to this is the technical challenge of understanding the concepts involved in survival analysis and of dealing with unbalanced data, as often occurs with factors such as hospital mortality and length of hospital stay [11]. This study aimed to estimate the expected survival time and to evaluate the association and impact of different factors on the survival time of hospitalized patients, based on a cohort of patients hospitalized in CCUs.…”
Section: Introduction (mentioning, confidence: 99%)
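As a concrete illustration of the survival-analysis setting referenced above, the following plain-NumPy sketch estimates a Kaplan-Meier survival curve from lengths of stay with censoring, here taking in-hospital death as the event and discharge alive as censoring. The toy cohort is an assumption for demonstration only, not data from the cited study.

    # Plain-NumPy Kaplan-Meier sketch (toy data, assumed for illustration).
    import numpy as np

    def kaplan_meier(durations, events):
        """Return the event times and the estimated survival probability at each one."""
        durations = np.asarray(durations, dtype=float)
        events = np.asarray(events, dtype=bool)
        times = np.unique(durations[events])
        surv, s = [], 1.0
        for t in times:
            at_risk = np.sum(durations >= t)            # patients still in hospital at time t
            deaths = np.sum((durations == t) & events)  # in-hospital deaths at time t
            s *= 1.0 - deaths / at_risk
            surv.append(s)
        return times, np.array(surv)

    # LOS in days; event=1 means in-hospital death, 0 means censored (discharged alive)
    t, s = kaplan_meier([3, 5, 5, 8, 10, 12], [1, 0, 1, 1, 0, 1])
    print(dict(zip(t, np.round(s, 3))))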