2021
DOI: 10.1109/tii.2020.3025204

Variational Autoencoder Bidirectional Long and Short-Term Memory Neural Network Soft-Sensor Model Based on Batch Training Strategy

Cited by 59 publications (25 citation statements); References 12 publications.
“…For example, a deep probabilistic transfer learning framework [12] and a model-agnostic meta-learning method [13] were proposed to improve the model performance when migrating from the relevant source domain to the target domain. In addition, to address the LSTM's inadequacies, a variational autoencoder-based LSTM was designed, which adopts batch training and L2 regularization techniques to learn crucial data information from various process data [10]. In [14], another improved method of LSTM called Gated Convolutional Neural Network-based Transformer (GCT) was implemented to deal with the gradient vanishing and the parallel computing difficulties.…”
Section: A Data-driven Soft Sensor
confidence: 99%
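The batch training and L2 regularization that the quoted passage attributes to the VAE-based LSTM [10] can be sketched in miniature. The linear model, learning rate, and synthetic data below are illustrative assumptions for exposition only, not the paper's actual network:

```python
import numpy as np

# Mini-batch gradient descent with an L2 penalty (weight decay).
# A linear model stands in for the VAE-LSTM to keep the sketch short.
rng = np.random.default_rng(0)
X = rng.normal(size=(256, 4))            # process variables
true_w = np.array([1.0, -2.0, 0.5, 3.0])
y = X @ true_w + 0.1 * rng.normal(size=256)

w = np.zeros(4)
lr, l2, batch = 0.1, 1e-3, 32

for epoch in range(100):
    idx = rng.permutation(len(X))        # reshuffle each epoch
    for start in range(0, len(X), batch):
        b = idx[start:start + batch]     # one mini-batch of samples
        grad = X[b].T @ (X[b] @ w - y[b]) / len(b)
        w -= lr * (grad + l2 * w)        # L2 term shrinks the weights

print(np.round(w, 2))
```

The L2 term `l2 * w` penalizes large weights on every update, which is the regularization role the quoted passage describes; mini-batching trades gradient noise for cheaper updates.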
“…Thus, artificial samples can be obtained by randomly sampling from the distribution. In addition, to cope with the complex data characteristics, an improved LSTM with the variational autoencoder [10] is designed to extract the complex temporal information in the industrial time series, while SS-PdeepFM [11] is used to extract low-dimensional and high-dimensional features in a single time step. In summary, there are still several major challenges with existing data-driven approaches, as follows:…”
Section: Introduction
confidence: 99%
“…The Bi-LSTM model behaves the same with all inputs. The mathematical formulations of Bi-LSTM are presented in detail in [36].…”
Section: Bidirectional Long Short-Term Memory (Bi-LSTM)
confidence: 99%
“…In the structure of the traditional RNN and the LSTM model, information propagates along a forward path only, so the state at time t depends solely on information before time t. In the Bi-LSTM network, unlike in the traditional LSTM, information flows both from the backward layer to the forward layer and vice versa through hidden states [36]. Additionally, the advantage of Bi-LSTM over convolutional neural networks (CNNs) is its dependency on the sequence of inputs by taking the forward and backward paths into account.…”
Section: Case Study
confidence: 99%
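The bidirectional flow described above can be sketched concretely: one recurrent pass reads the sequence forward, a second reads it backward, and the two hidden states are concatenated at each step. A plain tanh RNN cell stands in for the LSTM cell to keep the example short; the weights and dimensions below are illustrative assumptions:

```python
import numpy as np

def rnn_pass(x_seq, W_x, W_h):
    """Run a simple tanh RNN over x_seq, returning all hidden states."""
    h = np.zeros(W_h.shape[0])
    states = []
    for x in x_seq:
        h = np.tanh(W_x @ x + W_h @ h)
        states.append(h)
    return states

rng = np.random.default_rng(0)
T, d_in, d_h = 5, 3, 4
seq = rng.normal(size=(T, d_in))
W_x = rng.normal(size=(d_h, d_in))
W_h = rng.normal(size=(d_h, d_h))

fwd = rnn_pass(seq, W_x, W_h)              # state at t sees steps 1..t
bwd = rnn_pass(seq[::-1], W_x, W_h)[::-1]  # state at t sees steps t..T
bi = [np.concatenate([f, b]) for f, b in zip(fwd, bwd)]

print(len(bi), bi[0].shape)  # 5 (8,)
```

Concatenating the two passes is why each Bi-LSTM output depends on the whole sequence, not only on past inputs, which is the dependency advantage over CNNs that the quoted passage notes.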
“…This can result in a delay in the quality control system, as no quality indicator is available at this time. In the absence of an economical or effective online measurement, soft sensors could serve as an alternative solution [8][9][10][11][12][13][14]. Additionally, with the wide availability of process data in PP processes, a growing number of data-driven soft sensors have been adopted to predict the MI.…”
Section: Introduction
confidence: 99%