2019
DOI: 10.1021/acs.iecr.9b00300

Randomized Kernel Principal Component Analysis for Modeling and Monitoring of Nonlinear Industrial Processes with Massive Data

Abstract: Kernel principal component analysis (KPCA) has shown excellent performance in monitoring nonlinear industrial processes. However, model building, updating, and online monitoring using KPCA are generally time-consuming when massive data are obtained under the normal operation condition (NOC). The main reason is that the eigen-decomposition of a high-dimensional kernel matrix constructed from massive NOC samples is computationally complex. Many studies have been devoted to solving this problem through reducing t…
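The bottleneck the abstract points to is the eigen-decomposition of the N x N kernel matrix built from N NOC samples. The minimal Python sketch below contrasts that step with a randomized low-rank factorization of the centered kernel matrix; the RBF kernel, gamma, sample size, and component count are illustrative assumptions, and this is not the authors' exact randomized KPCA algorithm.

```python
# Minimal sketch (not the paper's exact method): approximate the leading KPCA
# eigenpairs of a centered kernel matrix with randomized SVD instead of a full
# eigen-decomposition. Sample size, gamma, and n_components are illustrative.
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.utils.extmath import randomized_svd

rng = np.random.default_rng(0)
X = rng.standard_normal((2000, 20))              # stand-in for massive NOC data

K = rbf_kernel(X, gamma=0.05)                    # N x N kernel matrix
N = K.shape[0]
J = np.full((N, N), 1.0 / N)
Kc = K - J @ K - K @ J + J @ K @ J               # centering in feature space

# A full eigen-decomposition of Kc costs O(N^3); randomized SVD recovers only
# the leading k components at roughly O(N^2 k) cost.
U, s, _ = randomized_svd(Kc, n_components=10, random_state=0)

alphas = U / np.sqrt(s)                          # KPCA coefficient vectors
scores = Kc @ alphas                             # nonlinear principal scores
```

For truly massive N, even forming the full N x N kernel matrix is expensive, which is why kernel approximations such as the Nyström method or random features (discussed in the citing works below) are often combined with randomized factorizations of this kind.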

Cited by 16 publications (9 citation statements) · References 39 publications
“…Dimension reduction was vital to the accuracy of the model because of the strong coupling and redundancy in the indicator system. PCA was able to merge the original features and reduce the dimension to simplify computation, especially for strongly linear indicators [50,51]. When the data were processed by PCA, only some principal influencing factors were considered.…”
Section: The Principle of the PCA-GA-BP Neural Network
confidence: 99%
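As a companion to the PCA preprocessing step described in this citing work, here is a minimal Python sketch of PCA-based dimension reduction on a redundant indicator matrix; the synthetic data, scaling step, and 95% variance threshold are illustrative assumptions and are not taken from the PCA-GA-BP paper.

```python
# Minimal sketch (illustrative data and threshold): PCA merges correlated
# indicators into a few principal components before a downstream model.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
base = rng.standard_normal((200, 4))
indicators = np.hstack([base, base @ rng.standard_normal((4, 8))])  # redundant columns

scaled = StandardScaler().fit_transform(indicators)   # put indicators on a common scale
pca = PCA(n_components=0.95)                          # keep components explaining 95% of variance
reduced = pca.fit_transform(scaled)

print(indicators.shape, "->", reduced.shape)          # far fewer features retained
```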
“…For more information, see the theoretical and empirical comparison of the Nyström method and random Fourier features by Yang et al. [318]. Other related low-rank approximation schemes include that of Peng et al. [283], which applies to kernel ICA, and that of Zhou et al. [286], called randomized kernel PCA. Lastly, a different approximation using the Taylor expansion of the RBF kernel was also derived by Wang et al. [288,289] and was called kernel sample equivalent replacement.…”
Section: Fast Computation of Kernel Features
confidence: 99%
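To complement the Nyström and random Fourier feature comparison mentioned above, below is a minimal Python sketch of a Rahimi–Recht-style random Fourier feature map approximating an RBF kernel; the data, gamma, and number of random features are illustrative assumptions, not the implementation of any specific cited work.

```python
# Minimal sketch (illustrative settings): random Fourier features give an explicit
# low-dimensional map Z whose Gram matrix Z @ Z.T approximates the RBF kernel.
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 20))
gamma, D = 0.1, 2000                                 # kernel width and number of random features

W = rng.normal(scale=np.sqrt(2 * gamma), size=(X.shape[1], D))  # frequencies for exp(-gamma*||x-y||^2)
b = rng.uniform(0.0, 2 * np.pi, size=D)
Z = np.sqrt(2.0 / D) * np.cos(X @ W + b)             # explicit random feature map

K_exact = rbf_kernel(X, gamma=gamma)
K_approx = Z @ Z.T
print(np.abs(K_exact - K_approx).max())              # error shrinks as D grows
```

Because the feature map is explicit, ordinary linear PCA on Z approximates KPCA on the original kernel without ever forming the full N x N kernel matrix.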
“…Lahdhiri et al. (2018) developed a novel reduced Rank-KPCA to eliminate the dependencies among variables in the feature space and retain a reduced data set from the original one, but it needs a large amount of training data to construct the reduced Rank-KPCA model. Zhou et al. (2019) considered the poor performance of a KPCA model constructed from a reduced sample set in monitoring nonlinear industrial processes and proposed randomized KPCA for monitoring nonlinear industrial processes with massive data.…”
Section: Introduction
confidence: 99%
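Since this citing work frames the paper as a way to monitor nonlinear processes with massive data, here is a minimal Python sketch of the downstream monitoring step with a generic KPCA model and a Hotelling-style T² statistic; scikit-learn's standard KernelPCA and an empirical 99% quantile limit are used as stand-ins, not the randomized scheme or control limits proposed by Zhou et al. (2019).

```python
# Minimal sketch (generic KPCA, illustrative settings): score a new sample with
# a T^2-style statistic whose control limit is an empirical NOC quantile.
import numpy as np
from sklearn.decomposition import KernelPCA

rng = np.random.default_rng(0)
X_noc = rng.standard_normal((1000, 10))              # normal-operation training data
x_new = rng.standard_normal((1, 10)) + 3.0           # disturbed sample, likely to be flagged

kpca = KernelPCA(n_components=5, kernel="rbf", gamma=0.1).fit(X_noc)

scores_noc = kpca.transform(X_noc)
var = scores_noc.var(axis=0)                         # per-component variance under NOC

t2_noc = ((scores_noc ** 2) / var).sum(axis=1)
limit = np.quantile(t2_noc, 0.99)                    # empirical 99% control limit

t2_new = ((kpca.transform(x_new) ** 2) / var).sum()
print("fault" if t2_new > limit else "normal", round(t2_new, 2), round(limit, 2))
```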