2019 IEEE International Conference on Advanced Trends in Information Theory (ATIT)
DOI: 10.1109/atit49449.2019.9030505
Unsupervised Anomaly Detection in Time Series Using LSTM-Based Autoencoders

Cited by 99 publications (50 citation statements); references 0 publications.
“…This has a practical meaning, since anomalous data are not always available, or it is impossible to cover all types of such data. Many advantages of the autoencoder approach are discussed in Provotar et al. (2019). The use of LSTM autoencoders for anomaly detection on multivariate time series data can be seen in several studies, for example Pereira and Silveira (2018) and Principi et al. (2019).…”
Section: The Needed Concepts
confidence: 99%
“…With the continuous development of deep learning, many researchers choose deep learning models such as RNNs (Recurrent Neural Networks) and LSTMs (Long Short-Term Memory networks) for anomaly detection [14,15]. For example, Hundman et al. [16] used LSTMs for anomaly detection in spacecraft telemetry time series data.…”
Section: KPI Time Series Anomaly Detection Methods
confidence: 99%
“…For time series, LSTM autoencoders capture temporal dependencies [12]. The approach of Malhotra et al. [12] is a semi-supervised autoencoder that assumes access to purely normal training data, but autoencoders can also be trained fully unsupervised on a mixture of normal and abnormal data [13]. With this approach, the assumption is that, because anomalies are underrepresented, a sufficiently restricted model will fit only the normal points, so the reconstruction error will be larger for abnormal points or subsequences.…”
Section: Related Work
confidence: 99%
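The reconstruction-error criterion described in these excerpts can be sketched in a few lines. This is a minimal illustration, not the method of any cited paper: a simple moving-average filter stands in for the LSTM autoencoder's reconstruction, and the mean-plus-three-standard-deviations threshold on the error distribution is an assumed choice.

```python
import numpy as np

def detect_anomalies(series, reconstruct, threshold_sigma=3.0):
    """Flag points whose reconstruction error is unusually large.

    `reconstruct` is any model that maps a series to its reconstruction;
    in the cited works this role is played by an LSTM autoencoder.
    """
    errors = np.abs(series - reconstruct(series))
    # Threshold at mean + k * std of the error distribution. The working
    # assumption from the excerpt: anomalies are rare, so the error
    # statistics are dominated by normal points.
    threshold = errors.mean() + threshold_sigma * errors.std()
    return errors > threshold

def smooth(x, w=5):
    # Toy stand-in for an autoencoder: a moving average reconstructs
    # slowly varying signals well and spikes poorly.
    return np.convolve(x, np.ones(w) / w, mode="same")

rng = np.random.default_rng(0)
x = np.sin(np.linspace(0, 8 * np.pi, 400)) + 0.05 * rng.standard_normal(400)
x[200] += 3.0  # inject a point anomaly
flags = detect_anomalies(x, smooth)
print(np.flatnonzero(flags))  # indices near 200 are flagged
```

The same decision rule applies unchanged when `reconstruct` is a trained LSTM autoencoder; only the quality of the reconstruction, and hence the separation between normal and anomalous errors, differs.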