2013
DOI: 10.1007/s11814-013-0034-7

Fault detection in nonlinear chemical processes based on kernel entropy component analysis and angular structure

Cited by 18 publications (14 citation statements). References 25 publications.

Citation statements, ordered by relevance:
“…Entropy is a concept representing the amount of information contained. Studies have shown that, by introducing entropy information, the KECA algorithm can achieve better nonlinear processing results than the KPCA algorithm [23][24][25]. Therefore, drawing on the principle of KECA, the KPLS algorithm can also be used to extract the eigenvalues and eigenvectors according to the information entropy.…”
Section: Prediction Model Based on KEPLS, 5.1 Principles of the KEPLS Algorithm (mentioning)
confidence: 99%
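The entropy-guided component selection this statement refers to can be made concrete. Below is a minimal sketch of KECA-style feature extraction, assuming a Gaussian (RBF) kernel; the function name keca_transform, the sigma parameter, and the NumPy/SciPy implementation are illustrative choices, not code from the cited paper.

```python
import numpy as np
from scipy.spatial.distance import cdist

def keca_transform(X, n_components=2, sigma=1.0):
    """Sketch of KECA-style projection: keep the kernel eigenpairs that
    contribute most to the Renyi quadratic entropy estimate (assumed
    RBF kernel; parameters here are illustrative)."""
    # Gaussian (RBF) kernel matrix over the samples.
    K = np.exp(-cdist(X, X, "sqeuclidean") / (2.0 * sigma ** 2))
    # Eigendecomposition of the symmetric kernel matrix (ascending order).
    eigvals, eigvecs = np.linalg.eigh(K)
    ones = np.ones(X.shape[0])
    # Entropy contribution of each eigenpair: (sqrt(lambda_i) * e_i^T 1)^2.
    # KECA ranks components by this quantity, which need not coincide with
    # the largest eigenvalues used by KPCA.
    contrib = eigvals * (eigvecs.T @ ones) ** 2
    idx = np.argsort(contrib)[::-1][:n_components]
    # Scores: selected eigenvectors scaled by the square root of their eigenvalues.
    return eigvecs[:, idx] * np.sqrt(np.maximum(eigvals[idx], 0.0))

# Example usage on synthetic data:
# X = np.random.default_rng(0).normal(size=(100, 5))
# Z = keca_transform(X, n_components=2, sigma=2.0)
```

The key design difference from KPCA is the ranking step: components are ordered by their contribution to the entropy estimate rather than by eigenvalue magnitude alone.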
“…KECA was proposed by R. Jenssen; the original idea of the KECA method is introduced in detail in [12], [34]. The definition of the Rényi quadratic entropy is as follows:…”
Section: KECA Feature Reduction Algorithm (mentioning)
confidence: 99%
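The quoted definition is cut off by the excerpt. For context, the Rényi quadratic entropy and its kernel-based estimator can be reconstructed from the standard KECA literature as follows (a sketch based on Jenssen's formulation, not a quote from this statement):

```latex
% Renyi quadratic entropy of a probability density p(x):
H_2(p) = -\log \int p^2(\mathbf{x}) \, d\mathbf{x}

% Parzen-window estimate over N samples, with kernel matrix K:
\hat{V}(p) = \frac{1}{N^2} \mathbf{1}^\top \mathbf{K}\, \mathbf{1},
\qquad
\hat{H}_2(p) = -\log \hat{V}(p)

% Using the eigendecomposition K = E D E^T, each eigenpair
% (lambda_i, e_i) contributes a separate term, which KECA uses
% to rank and select components:
\hat{V}(p) = \frac{1}{N^2} \sum_{i=1}^{N}
\left( \sqrt{\lambda_i}\; \mathbf{e}_i^\top \mathbf{1} \right)^{2}
```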
“…It has proven useful in different applications, e.g., remote sensing data analysis [3], [4], [5], face recognition [6], chemical process modelling [7], high-dimensional celestial spectra reduction [8], and audio processing [9]. Several extensions have been proposed for feature selection [10], class-dependent feature extraction [11], and semi-supervised learning as well [12].…”
Section: Introduction (mentioning)
confidence: 99%