2014
DOI: 10.1016/j.knosys.2014.08.027
Feature selection for noisy variation patterns using kernel principal component analysis

Cited by 18 publications (10 citation statements)
References 22 publications
“…Another class of nonlinear dimensionality reduction methods is the family of kernel-based methods. Following the successful application of kernel methods in support vector machines, kernel-based dimensionality reduction methods such as kernel principal component analysis (KPCA) [25], kernel discriminant analysis (KDA) [26], generalized discriminant analysis (GDA) [27], and kernel canonical correlation analysis (KCCA) [28] have been widely used in nonlinear pattern classification. However, the form of the kernel function determines the geometric structure of the feature space and thus has a crucial impact on the performance of kernel-based methods.…”
Section: Introduction
confidence: 99%
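The excerpt above names KPCA as the canonical kernel-based reduction method and stresses that the choice of kernel shapes the feature-space geometry. A minimal NumPy sketch of RBF-kernel PCA (a standard textbook formulation, not the cited paper's specific procedure; the concentric-ring data and the `gamma` value are illustrative assumptions):

```python
import numpy as np

def rbf_kernel_pca(X, gamma, n_components):
    """Project training data onto the top principal components in an
    RBF-kernel feature space (standard KPCA sketch)."""
    # Pairwise squared Euclidean distances.
    sq = np.sum(X**2, axis=1)
    sq_dists = sq[:, None] + sq[None, :] - 2 * X @ X.T
    K = np.exp(-gamma * sq_dists)          # RBF (Gaussian) kernel matrix
    # Center the kernel matrix in feature space.
    n = K.shape[0]
    one_n = np.ones((n, n)) / n
    Kc = K - one_n @ K - K @ one_n + one_n @ K @ one_n
    # eigh returns eigenvalues in ascending order; take the largest first.
    eigvals, eigvecs = np.linalg.eigh(Kc)
    eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]
    # Scale eigenvectors so Kc @ alphas gives the feature-space projections.
    alphas = eigvecs[:, :n_components] / np.sqrt(
        np.maximum(eigvals[:n_components], 1e-12))
    return Kc @ alphas

# Two concentric rings: linearly inseparable in the input space.
rng = np.random.default_rng(0)
t = rng.uniform(0, 2 * np.pi, 100)
inner = 0.3 * np.c_[np.cos(t), np.sin(t)]
outer = 1.0 * np.c_[np.cos(t), np.sin(t)]
X = np.vstack([inner, outer])
Z = rbf_kernel_pca(X, gamma=5.0, n_components=2)
```

Changing `gamma` (or swapping in a polynomial kernel) changes the geometry of the induced feature space, which is exactly the sensitivity the excerpt warns about.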
“…In fault diagnosis, these methods are first used to obtain the original signal features. Other methods, such as genetic algorithms [11][12][13][14], principal component analysis [15,16], and kernel principal component analysis [17,18], are then used to perform the necessary selections and transformations to obtain appropriate features.…”
Section: Introduction
confidence: 99%
“…Familiar pre-processing methods for dimension reduction include principal component analysis (PCA), linear discriminant analysis, and locally linear embedding (LLE), among others. Sahu et al. (2014) proposed a feature selection procedure that augments kernel PCA to obtain importance estimates of the features from noisy training data. However, PCA can only identify linear relationships among the features in the data.…”
Section: Introduction
confidence: 99%
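The last excerpt's point that plain PCA captures only linear structure can be made concrete: linear PCA of centered data is an orthogonal rotation of the data, so it preserves point radii exactly and cannot unfold nonlinear structure such as two concentric rings (a toy illustration assuming NumPy; the ring data is our own example, not from the cited works):

```python
import numpy as np

# Two concentric rings around the origin.
rng = np.random.default_rng(1)
t = rng.uniform(0, 2 * np.pi, 100)
X = np.vstack([0.3 * np.c_[np.cos(t), np.sin(t)],   # inner ring
               1.0 * np.c_[np.cos(t), np.sin(t)]])  # outer ring

# PCA via SVD of the centered data.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt.T    # PCA scores; with all components kept, a pure rotation

# Radii are unchanged, so the rings remain concentric after linear PCA.
assert np.allclose(np.linalg.norm(Z, axis=1), np.linalg.norm(Xc, axis=1))
```

This is the limitation that motivates the kernel trick in KPCA: mapping to a nonlinear feature space before applying PCA allows such structure to become linearly separable.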