2021 IEEE 17th International Conference on Automation Science and Engineering (CASE)
DOI: 10.1109/case49439.2021.9551541
A Comparative Evaluation of Deep Learning Anomaly Detection Techniques on Semiconductor Multivariate Time Series Data

Cited by 8 publications (4 citation statements) · References 23 publications
“…The machine learning method long short-term memory (LSTM) networks represented the most used method, with 22 occurrences. Furthermore, there were ten LSTM variations: attention-based long short-term memory (ALSTM), which uses a context vector to infer different attention degrees of distinct data features at specific time points [22]; bidirectional long short-term memory (BLSTM), which processes data both in chronological order, from start to end, and in the reverse order [21,23]; deep long short-term memory (DeepLSTM), an LSTM network with stacked layers connected to a dense layer distributed over time [45]; long short-term memory with nonparametric dynamic thresholding (LSTM-NDT) [38]; long short-term memory variational autoencoder (LSTM-VAE) [38]; singular spectrum analysis bidirectional long short-term memory (SSA-BLSTM) [46]; long short-term memory autoencoder (LSTM-AE) [47]; long short-term memory anomaly detection (LSTM-AD) [48]; encoder-decoder anomaly detection (EncDec-AD) [48]; and the ontology-based LSTM neural network (OntoLSTM), which uses an ontology to represent semantic concepts of a production line, together with an LSTM network for learning temporal dependencies [49].…”
Section: Taxonomy (mentioning, confidence: 99%)
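To make the reconstruction-based LSTM variants named above concrete, here is a minimal sketch of the encoder-decoder idea behind LSTM-AE and EncDec-AD, assuming PyTorch; the class name, layer sizes, and data shapes are illustrative assumptions, not configurations from the surveyed papers.

```python
# Minimal LSTM encoder-decoder for reconstruction-based anomaly
# detection, in the spirit of LSTM-AE / EncDec-AD. All names and
# hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn

class LSTMAutoencoder(nn.Module):
    def __init__(self, n_sensors: int, hidden: int = 64):
        super().__init__()
        self.encoder = nn.LSTM(n_sensors, hidden, batch_first=True)
        self.decoder = nn.LSTM(hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, n_sensors)

    def forward(self, x):                      # x: (batch, time, sensors)
        _, (h, _) = self.encoder(x)            # compress the window into a latent state
        z = h[-1].unsqueeze(1).repeat(1, x.size(1), 1)  # repeat latent at every step
        dec, _ = self.decoder(z)
        return self.out(dec)                   # reconstruction of x

model = LSTMAutoencoder(n_sensors=8)
window = torch.randn(32, 100, 8)               # 32 windows, 100 steps, 8 sensors
recon = model(window)
# Per-window reconstruction error; windows whose error exceeds a
# threshold fitted on normal data would be flagged as anomalous.
score = ((recon - window) ** 2).mean(dim=(1, 2))
```

Trained on normal data only, the model reconstructs healthy windows well and anomalous windows poorly, which is what makes the reconstruction error usable as an anomaly score.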
“…The third most used data science method was the decision tree method random forest (RF), accumulating 14 occurrences, followed by convolutional neural network (CNN), with 11 occurrences, and recurrent neural network (RNN), with 9 occurrences. Twelve CNN variations stood out as branches: fault detection and classification convolutional neural network (FDC-CNN), designed to detect faults in multivariate sensor signals over a time axis, extracting fault features; multichannel deep convolutional neural networks (MC-DCNN), whose objective is to deal with multiple sensors that generate data of different lengths; multiple-time-series convolutional neural network (MTS-CNN), designed for fault detection and diagnosis of time series, which uses a multichannel CNN to extract important data features [52]; temporal convolutional network (TCN), which works by summarizing signals in time steps, using a maximum and minimum value per step [53]; residual neural networks (ResNet) [54]; residual squeeze net (RSNet) [45]; stacked residual dilated convolutional neural network (SRDCNN) [32]; wide first kernel and deep convolutional neural network (WDCNN) [32,55]; convolutional neural network maximum mean discrepancy (CNN-MMD) [33]; deep convolutional transfer learning network (DCTLN) [55]; attention fault detection and classification convolutional neural network (AFDC-CNN) [48]; and the time-series multiple-channel convolutional neural network (TSMC-CNN), which takes as input N-variate time series split into segments, easing the extraction of data points [22]. RNN represented three branches: gated recurrent unit (GRU), long short-term memory (LSTM), and bidirectional recurrent neural network (BRNN).…”
Section: Taxonomy (mentioning, confidence: 99%)
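As a companion to the CNN branch above, below is a sketch of a 1-D convolutional fault detector in the spirit of FDC-CNN: sensors become input channels and convolution runs along the time axis. Again a PyTorch sketch with hypothetical layer sizes, not any surveyed paper's exact architecture.

```python
# 1-D CNN over multivariate sensor windows: each sensor is an input
# channel, and the convolution slides over the time axis.
import torch
import torch.nn as nn

class Conv1DFaultDetector(nn.Module):
    def __init__(self, n_sensors: int, n_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_sensors, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),                   # halve the temporal resolution
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),           # global pooling over time
        )
        self.classify = nn.Linear(64, n_classes)

    def forward(self, x):                      # x: (batch, sensors, time)
        return self.classify(self.features(x).squeeze(-1))

# 16 windows, 8 sensors, 200 time steps -> per-window fault logits
logits = Conv1DFaultDetector(n_sensors=8)(torch.randn(16, 8, 200))
```

The global pooling at the end is one common way to make the classifier indifferent to the exact window length, a practical concern when sensors produce traces of varying duration.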
“…However, like any machine learning technique, their performance is highly dependent on the quality of the data and the specific problem being solved. Self-supervised learning methods based on LSTMs or AEs for fault detection on semiconductor time-series data perform worse than supervised learning methods, as shown in [33]. In [33], CNN-based fault detection methods exhibited the best performance.…”
Section: Supervised Deep Learning for Fault Detection (mentioning, confidence: 99%)
“…Self-supervised learning methods based on LSTMs or AEs for fault detection on semiconductor time-series data perform worse than supervised learning methods, as shown in [33]. In [33], CNN-based fault detection methods exhibited the best performance.…”
Section: Supervised Deep Learning for Fault Detection (mentioning, confidence: 99%)
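The contrast these statements draw, supervised classifiers trained on labeled fault windows versus self-supervised models trained only to reconstruct normal data, can be summarized with the two sketch models above. The tensors below are hypothetical stand-ins for real semiconductor sensor data; this illustrates the two training objectives, not the evaluation protocol of [33].

```python
# Supervised vs. self-supervised training objectives, reusing the
# sketch classes LSTMAutoencoder and Conv1DFaultDetector from above.
import torch
import torch.nn.functional as F

# Supervised: requires labels for both normal and faulty windows.
x_sup = torch.randn(16, 8, 200)                # (batch, sensors, time)
y_sup = torch.randint(0, 2, (16,))             # 0 = normal, 1 = fault
cnn = Conv1DFaultDetector(n_sensors=8)
loss_sup = F.cross_entropy(cnn(x_sup), y_sup)  # direct classification loss

# Self-supervised: trained to reconstruct normal windows only; faults
# are inferred afterwards from high reconstruction error.
x_normal = torch.randn(16, 100, 8)             # (batch, time, sensors)
ae = LSTMAutoencoder(n_sensors=8)
loss_ae = F.mse_loss(ae(x_normal), x_normal)
```

The supervised route can exploit label information directly, which is consistent with the finding quoted above that supervised CNN-based detectors outperformed the reconstruction-based LSTM and AE approaches on this data.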