2018
DOI: 10.1609/aaai.v32i1.11635
Attend and Diagnose: Clinical Time Series Analysis Using Attention Models

Abstract: With widespread adoption of electronic health records, there is an increased emphasis for predictive models that can effectively deal with clinical time-series data. Powered by Recurrent Neural Network (RNN) architectures with Long Short-Term Memory (LSTM) units, deep neural networks have achieved state-of-the-art results in several clinical prediction tasks. Despite the success of RNN, its sequential nature prohibits parallelized computing, thus making it inefficient particularly when processing long sequence…
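The abstract's core architectural point is that self-attention restricted to past time steps can replace the step-by-step recurrence of an LSTM while letting all positions be computed in parallel. The sketch below is a generic causal self-attention computation in NumPy, not the paper's implementation; the shapes, parameter names, and the toy 48-step input are illustrative assumptions.

```python
# Minimal sketch (not the paper's exact implementation): causal scaled
# dot-product self-attention over a clinical time series. All shapes and
# parameter names here are illustrative assumptions.
import numpy as np

def causal_self_attention(x, w_q, w_k, w_v):
    """x: (T, d) sequence of T time steps with d features per step."""
    q, k, v = x @ w_q, x @ w_k, x @ w_v           # project inputs
    scores = q @ k.T / np.sqrt(k.shape[-1])       # (T, T) similarity scores
    # Causal mask: position t may only attend to positions <= t,
    # so a prediction never looks at future measurements.
    mask = np.triu(np.ones_like(scores, dtype=bool), 1)
    scores[mask] = -np.inf
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over past steps
    return weights @ v                              # (T, d) context vectors

# Toy example: 48 hourly measurements with 16 features each.
rng = np.random.default_rng(0)
T, d = 48, 16
x = rng.normal(size=(T, d))
w_q, w_k, w_v = (rng.normal(size=(d, d)) for _ in range(3))
out = causal_self_attention(x, w_q, w_k, w_v)
print(out.shape)  # (48, 16)
```

Because every output row comes from masked matrix products over the full sequence, no hidden state has to be carried sequentially from one time step to the next, which is what makes this formulation parallelizable where an RNN is not.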

Cited by 292 publications (99 citation statements)
References 14 publications
“…The representation from the last time step of the last layer is transformed by a dense layer to generate output. • Simply Attend and Diagnose (SaND) [27]: This model also has the same input representation as GRU and the input is passed through a Transformer with causal attention and a dense interpolation layer. • GRU with trainable Decays (GRU-D) [4]: The GRU-D cell takes a vector of variable values at each time one or more measurements are seen.…”
Section: Baseline Methods
Citation type: mentioning, confidence: 99%
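The SaND baseline quoted above pairs causal attention with a dense interpolation layer that compresses the per-time-step representations into a fixed-size summary before the output layer. The sketch below shows one way such a layer can be written; the quadratic weighting scheme and the parameter M are illustrative assumptions and may differ from the implementation in [27].

```python
# Hedged sketch of a dense interpolation layer in the spirit of the SaND
# baseline described above: it compresses a length-T sequence of hidden
# states into M summary vectors using fixed, position-dependent weights.
# The specific quadratic weighting is an illustrative choice.
import numpy as np

def dense_interpolation(h, M):
    """h: (T, d) hidden states from the attention stack; returns (M, d)."""
    T, d = h.shape
    u = np.zeros((M, d))
    for t in range(1, T + 1):
        s = M * t / T                        # map step t onto the M output slots
        for m in range(1, M + 1):
            w = (1.0 - abs(s - m) / M) ** 2  # weight peaks when slot m is near s
            u[m - 1] += w * h[t - 1]
    return u                                  # fixed-size summary for the classifier

# Example: compress 48 time steps of 16-dim representations into M = 12 vectors.
h = np.random.default_rng(1).normal(size=(48, 16))
u = dense_interpolation(h, M=12)
print(u.shape)  # (12, 16)
```

In practice M is chosen much smaller than the sequence length T, so the downstream classifier sees a compact, order-aware summary rather than only the representation of the final time step.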
“…This work uses a combination of a temporal convolution network and a pointwise convolution network. The model concentrates mainly on handling challenges such as missing data, data imbalance, and irregular sampling. Attention-based models that aim to improve performance on the length-of-stay prediction task have been proposed by Vaswani et al. [12] and Song et al. [13]. These models operate on clinical data and have shown that attention-based models slightly outperform LSTM-based models.…”
Section: Related Work
Citation type: mentioning, confidence: 99%
“…However, these tools exhibit several issues, the major one being that they can only work in surgical departments. Several improved algorithms were later developed for prediction of LOS [7,8,9]. Prediction time remains the biggest challenge when working with clinical records.…”
Section: Introduction
Citation type: mentioning, confidence: 99%
“…In recent years, time series have gained attention in the scientific community due to their importance in numerous applications (e.g., computer vision and natural language processing). There are many studies in the literature trying to model these signals with different techniques to improve state-of-the-art performance for different tasks [1][2][3][4]. Modeling time series, in general, is a particularly challenging task due to many factors, such as uncertainty, quality of data acquisition, and data scarcity [3][4][5].…”
Section: Introduction
Citation type: mentioning, confidence: 99%
“…There are many studies in the literature trying to model these signals with different techniques to improve state-of-the-art performance for different tasks [1][2][3][4]. Modeling time series, in general, is a particularly challenging task due to many factors, such as uncertainty, quality of data acquisition, and data scarcity [3][4][5]. These factors worsen when signals are collected from physiological processes and wearable devices, due to the quality of the devices and human factors.…”
Section: Introduction
Citation type: mentioning, confidence: 99%