2023
DOI: 10.1109/tii.2021.3129888
Deep Bayesian Slow Feature Extraction With Application to Industrial Inferential Modeling

Cited by 16 publications (4 citation statements)
References 28 publications
“…Hence, Jiang et al. [30] proposed a deep Bayesian extension of probabilistic slow feature analysis that enhances nonlinear slow feature extraction using gated recurrent unit (GRU) neural networks. A summary of the proposed model is given below.…”
Section: Deep Bayesian PSFA (citation type: mentioning, confidence: 99%)
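The statement above notes that the cited model uses GRU networks for nonlinear feature extraction. As background, a single GRU step can be sketched in NumPy as below; the parameter layout is an assumption for illustration, not the authors' implementation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h, params):
    # One GRU recurrence: update gate z, reset gate r, candidate state h_tilde.
    # params = (Wz, Uz, Wr, Ur, Wh, Uh): input and recurrent weights per gate.
    Wz, Uz, Wr, Ur, Wh, Uh = params
    z = sigmoid(x @ Wz + h @ Uz)              # update gate
    r = sigmoid(x @ Wr + h @ Ur)              # reset gate
    h_tilde = np.tanh(x @ Wh + (r * h) @ Uh)  # candidate hidden state
    return (1 - z) * h + z * h_tilde          # interpolate old and candidate
```

With all-zero weights the gates sit at 0.5 and the candidate at 0, so the new state is simply half the old one, which makes the recurrence easy to sanity-check.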
“…The training dataset is used to train the network, and the validation dataset is used to select the optimal number of latent variables m, the number of nodes in each layer, and the sample length l. The expectations in the lower bound equation are approximated by a Monte-Carlo estimate with twenty samples. The performance of the proposed method on the test data is compared with other state-of-the-art techniques, including a regular GRU-based auto-encoder (GRU-AE), variational Bayesian complex PSFA (VBCPSFA) [21], and variable-wise deep Bayesian PSFA (VW-DBPSFA) [30]. Table I presents, for two scenarios, each method's latent variable dimension, the root mean square error of the observed-variable reconstruction (R-RMSE), the root mean square error of the target-variable prediction (P-RMSE), and the correlation between the prediction and the actual target variable (ρ).…”
Section: A Simulation Case Study (citation type: mentioning, confidence: 99%)
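The evaluation metrics named in the statement above (R-RMSE, P-RMSE, and ρ) are standard; a minimal NumPy sketch of how such scores are computed is given below. The function names are illustrative, not taken from the cited work.

```python
import numpy as np

def rmse(y_true, y_pred):
    # Root mean square error between actual and predicted values
    # (used for both reconstruction and prediction error in Table I).
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

def pearson_corr(y_true, y_pred):
    # Correlation between the prediction and the actual target (the ρ column).
    return float(np.corrcoef(y_true, y_pred)[0, 1])

# Small demonstration on synthetic data: a predictor with small added noise
# should give a low RMSE and a correlation close to 1.
rng = np.random.default_rng(0)
y = rng.normal(size=200)
y_hat = y + 0.1 * rng.normal(size=200)
print(rmse(y, y_hat), pearson_corr(y, y_hat))
```

A lower RMSE and a ρ closer to 1 both indicate a better inferential model, which is how the comparison in Table I is read.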
“…As discussed in (17)-(18), the existing deep Bayesian model for PSFA [30] assumes a different and simplified functional form that considers only the information from z_{k-1} and y_k. The approximate posterior at time k is assumed to be a Gaussian distribution, as shown below.…”
Section: B Inference Network (citation type: mentioning, confidence: 99%)
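The statement above describes an approximate posterior q(z_k | z_{k-1}, y_k) that is Gaussian and conditioned only on the previous latent state and the current observation. A minimal NumPy sketch of such an inference network is shown below; the layer sizes, the random-parameter helper, and the class name are all assumptions for illustration, not the cited model's architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def dense(in_dim, out_dim):
    # Hypothetical helper: random affine-layer parameters (weights, bias).
    return rng.normal(scale=0.1, size=(in_dim, out_dim)), np.zeros(out_dim)

class GaussianInferenceNet:
    """Sketch of q(z_k | z_{k-1}, y_k) = N(mu_k, diag(exp(log_var_k)))."""

    def __init__(self, z_dim, y_dim, hidden=16):
        self.W1, self.b1 = dense(z_dim + y_dim, hidden)  # shared hidden layer
        self.Wm, self.bm = dense(hidden, z_dim)          # mean head
        self.Ws, self.bs = dense(hidden, z_dim)          # log-variance head

    def __call__(self, z_prev, y_k):
        # Condition only on z_{k-1} and y_k, as in the simplified form.
        h = np.tanh(np.concatenate([z_prev, y_k]) @ self.W1 + self.b1)
        mu = h @ self.Wm + self.bm
        log_var = h @ self.Ws + self.bs
        return mu, log_var

    def sample(self, z_prev, y_k):
        # Reparameterized draw: mu + sigma * eps, eps ~ N(0, I).
        mu, log_var = self(z_prev, y_k)
        return mu + np.exp(0.5 * log_var) * rng.normal(size=mu.shape)
```

Drawing several such samples per time step is how a Monte-Carlo estimate of the variational lower bound (as mentioned in the simulation study above) would typically be formed.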