2022
DOI: 10.3390/make4020015
An Attention-Based ConvLSTM Autoencoder with Dynamic Thresholding for Unsupervised Anomaly Detection in Multivariate Time Series

Abstract: As a substantial amount of multivariate time series data is being produced by the complex systems in smart manufacturing (SM), improved anomaly detection frameworks are needed to reduce the operational risks and the monitoring burden placed on the system operators. However, building such frameworks is challenging, as a sufficiently large amount of defective training data is often not available and frameworks are required to capture both the temporal and contextual dependencies across different time steps while…

Cited by 27 publications (12 citation statements)
References: 57 publications
“…For example, if an anomaly is detected, the channel that most likely caused the obtained model response could be highlighted to direct attention to a potential problem. So far, only Tayeh et al (2022), Zhang et al (2019), Homayouni et al (2020) and Su et al (2019) have proposed approaches that provide information about the problematic input channel. It is challenging to establish a high level of trust in data-driven models, which holds back their widespread adoption.…”
Section: Discussion (mentioning)
confidence: 99%
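
To illustrate the kind of channel-level attribution described above, here is a minimal sketch in Python; the function name and the mean-squared-error measure are assumptions for illustration, not taken from any of the cited approaches.

import numpy as np

# x: observed window, x_hat: the autoencoder's reconstruction, both (T, C)
# arrays with T time steps and C input channels (hypothetical shapes).
def rank_channels_by_error(x, x_hat):
    per_channel_error = ((x - x_hat) ** 2).mean(axis=0)  # mean squared error per channel
    ranking = np.argsort(per_channel_error)[::-1]        # worst-reconstructed channels first
    return ranking, per_channel_error

Channels at the top of the ranking would be the ones highlighted to the operator as the likely source of the detected anomaly.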
“…The algorithm can also provide a root cause analysis by labelling the channels associated with the three worst reconstructed correlations in a given matrix. Tayeh et al (2022) proposed an autoencoder structure similar to Zhang et al (2019), though without the skip connections. Here, a Bahdanau-style attention mechanism (Bahdanau et al, 2015) has been added to the model to maintain performance with increasing length of input sequences.…”
Section: Offline Training and Offline Inference but Online Capable (mentioning)
confidence: 99%
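
As a rough illustration of the Bahdanau-style (additive) attention mechanism mentioned above, here is a minimal PyTorch sketch; the layer names, dimensions, and the assumption that encoder states are flattened per time step are illustrative, not the authors' implementation.

import torch
import torch.nn as nn

class AdditiveAttention(nn.Module):
    def __init__(self, enc_dim, dec_dim, attn_dim):
        super().__init__()
        self.W_enc = nn.Linear(enc_dim, attn_dim, bias=False)  # projects encoder states
        self.W_dec = nn.Linear(dec_dim, attn_dim, bias=False)  # projects decoder state
        self.v = nn.Linear(attn_dim, 1, bias=False)            # scores each time step

    def forward(self, enc_states, dec_state):
        # enc_states: (batch, T, enc_dim) flattened encoder features per time step
        # dec_state:  (batch, dec_dim)    current decoder hidden state
        scores = self.v(torch.tanh(self.W_enc(enc_states)
                                   + self.W_dec(dec_state).unsqueeze(1)))  # (batch, T, 1)
        weights = torch.softmax(scores, dim=1)                             # attention over T
        context = (weights * enc_states).sum(dim=1)                        # (batch, enc_dim)
        return context, weights.squeeze(-1)

The context vector summarises the encoder time steps most relevant to the current decoding step, which is what helps performance hold up as the input sequence length grows.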
“…Here, the food crop classification process is performed through the ConvLSTM model. The LSTM-DNN model is a kind of Recurrent Neural Network (RNN) that excels at modelling temporal behaviors such as text, language, audio, and time series, owing to the additional parameter matrices available for the connections between time steps, along with the feedback loop used for learning [25]. The main components of LSTM are the forget gate, input gate, output gate, and memory cells.…”
Section: Food Crop Classification Using ConvLSTM (mentioning)
confidence: 99%
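
For reference, the gates and memory cell mentioned above follow the standard LSTM formulation (textbook equations, not taken from the cited paper):

f_t = \sigma(W_f x_t + U_f h_{t-1} + b_f)            (forget gate)
i_t = \sigma(W_i x_t + U_i h_{t-1} + b_i)            (input gate)
o_t = \sigma(W_o x_t + U_o h_{t-1} + b_o)            (output gate)
\tilde{c}_t = \tanh(W_c x_t + U_c h_{t-1} + b_c)     (candidate memory)
c_t = f_t \odot c_{t-1} + i_t \odot \tilde{c}_t      (memory cell update)
h_t = o_t \odot \tanh(c_t)                           (hidden state)

where \sigma is the logistic sigmoid and \odot is element-wise multiplication. In a ConvLSTM, the matrix products W x_t and U h_{t-1} are replaced by convolutions, which is what lets the model handle spatially structured inputs.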
“…Another reason that we opt for ConvLSTM is its ability to scale up the input (and, if necessary, output) parameter space without causing as much computational overload as other methods that utilize a covariance matrix, such as Gaussian Processes. There are multiple applications in the literature that use ConvLSTM for multivariate time-series prediction (Xiao et al, 2021; Mu et al, 2019; Tayeh et al, 2022). In the future, we are aiming to include many other relevant sensor variables (temperature, current, etc.)…”
Section: ConvLSTM Model and Training Parameters (mentioning)
confidence: 99%
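
To make the ConvLSTM idea above concrete, here is a minimal PyTorch sketch of a ConvLSTM cell; the class name, kernel size, channel counts, and input shapes are illustrative assumptions, not the configuration used in any of the cited works.

import torch
import torch.nn as nn

class ConvLSTMCell(nn.Module):
    # Minimal ConvLSTM cell: the four LSTM gates are computed by a single
    # convolution over the concatenated input and hidden feature maps, so
    # spatial/channel structure is preserved instead of being flattened away.
    def __init__(self, in_ch, hid_ch, kernel_size=3):
        super().__init__()
        self.hid_ch = hid_ch
        self.conv = nn.Conv2d(in_ch + hid_ch, 4 * hid_ch,
                              kernel_size, padding=kernel_size // 2)

    def forward(self, x, state):
        h, c = state                                  # each (B, hid_ch, H, W)
        gates = self.conv(torch.cat([x, h], dim=1))   # all four gates at once
        i, f, o, g = gates.chunk(4, dim=1)
        i, f, o = torch.sigmoid(i), torch.sigmoid(f), torch.sigmoid(o)
        g = torch.tanh(g)                             # candidate memory
        c = f * c + i * g                             # memory cell update
        h = o * torch.tanh(c)                         # new hidden state
        return h, c

# Example: unroll the cell over a window of frames shaped
# (batch, time, channels, height, width); all sizes here are assumed.
B, T, C, H, W = 4, 10, 1, 30, 30
cell = ConvLSTMCell(in_ch=C, hid_ch=16)
x_seq = torch.randn(B, T, C, H, W)
h = torch.zeros(B, 16, H, W)
c = torch.zeros(B, 16, H, W)
for t in range(T):
    h, c = cell(x_seq[:, t], (h, c))                  # h summarises the window so far

Because the state is a set of feature maps rather than a dense covariance structure, adding more sensor variables mainly adds input channels, which is the kind of scaling behaviour the statement refers to.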