1998
DOI: 10.1162/089976698300017467

Nonlinear Component Analysis as a Kernel Eigenvalue Problem

Abstract: We describe a new method for performing a nonlinear form of Principal Component Analysis. By the use of integral operator kernel functions, we can efficiently compute principal components in high-dimensional feature spaces, related to input space by some nonlinear map; for instance, the space of all possible 5-pixel products in 16×16 images. We give the derivation of the method, along with a discussion of other techniques which can be made nonlinear with the kernel approach, and present first experimental …
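The abstract's core idea can be summarized in a short numerical sketch: form a kernel matrix over the data, center it in feature space, and diagonalize it, so that nonlinear principal components are obtained without ever computing the feature map explicitly. The snippet below is a minimal illustration, not the paper's reference implementation; the RBF kernel, parameter values, and variable names are assumptions chosen for demonstration.

```python
# Minimal kernel PCA sketch in NumPy (illustrative only; the RBF kernel,
# parameter values, and names are assumptions, not taken from the paper).
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # k(x, y) = exp(-gamma * ||x - y||^2), evaluated for all pairs
    sq = (A**2).sum(1)[:, None] + (B**2).sum(1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * sq)

def kernel_pca(X, n_components=2, gamma=1.0):
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)                  # kernel matrix, no explicit map
    one_n = np.full((n, n), 1.0 / n)
    Kc = K - one_n @ K - K @ one_n + one_n @ K @ one_n   # center in feature space
    eigvals, eigvecs = np.linalg.eigh(Kc)        # eigenvalue problem on the centered matrix
    order = np.argsort(eigvals)[::-1][:n_components]
    lam, V = eigvals[order], eigvecs[:, order]
    alphas = V / np.sqrt(np.maximum(lam, 1e-12)) # unit-norm feature-space eigenvectors
    return Kc @ alphas                           # nonlinear principal component scores

X = np.random.RandomState(0).randn(100, 5)
Z = kernel_pca(X, n_components=2, gamma=0.5)     # shape (100, 2)
```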

Cited by 6,836 publications (3,458 citation statements)
References 16 publications
“…Kernel Principal Component Analysis (kernel PCA) (Scholkopf et al, 1998) has been proposed as a nonlinear extension of standard PCA and has been applied to various purposes, including feature extraction, denoising, and pre-processing of regression. Kernel PCA is an example of the so-called kernel methods (Scholkopf and Smola, 2002), which aim to extract nonlinear features of the original data by mapping them into a high-dimensional feature space, a Reproducing Kernel Hilbert Space (RKHS).…”
Section: Introduction
confidence: 99%
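As the statement above notes, kernel PCA is commonly used for feature extraction, denoising, and pre-processing. Below is a short usage sketch with scikit-learn's KernelPCA, an off-the-shelf implementation of the cited method; the kernel choice and parameter values are illustrative assumptions.

```python
# Illustrative kernel PCA for feature extraction and denoising with
# scikit-learn's KernelPCA; kernel and parameter values are assumptions
# chosen only for demonstration.
import numpy as np
from sklearn.decomposition import KernelPCA

rng = np.random.RandomState(0)
X = rng.randn(200, 16)                          # toy data
X_noisy = X + 0.3 * rng.randn(*X.shape)

kpca = KernelPCA(n_components=8, kernel="rbf", gamma=0.1,
                 fit_inverse_transform=True, alpha=1e-3)
features = kpca.fit_transform(X_noisy)          # nonlinear features, shape (200, 8)
X_denoised = kpca.inverse_transform(features)   # approximate pre-images in input space
```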
“…We can view kPCA as applying a nonlinear transformation φ(x) ∈ R^q to the m-dimensional features x to obtain a higher-dimensional representation, q ≫ m [37,38]. However, this might be computationally expensive, especially if the number of dimensions in the original data is high. We therefore use the so-called kernel function to obtain the kernel matrix of the data K(x_i, x_j), i, j = 1, …, N.…”
Section: Kernel-Principal Component Analysis
confidence: 99%
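The point made in that statement, that forming φ(x) explicitly is expensive while the kernel function yields the needed inner products directly, can be illustrated with a homogeneous polynomial kernel, for which (x·y)^d equals the inner product of all degree-d monomial features. The sizes and degree below are illustrative assumptions, not values from the cited work.

```python
# Sketch of the kernel trick for a homogeneous polynomial kernel:
# K[i, j] = (x_i . x_j)**d equals the inner product of all degree-d monomial
# features of x_i and x_j, so the ~C(m+d-1, d)-dimensional map phi is never
# formed. The sizes and degree below are illustrative assumptions.
import numpy as np

rng = np.random.RandomState(0)
N, m, d = 50, 16, 5          # N samples, m input dimensions, degree-5 products
X = rng.randn(N, m)

K = (X @ X.T) ** d           # N x N kernel matrix, computed directly
```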
“…, we use the kernel principal component analysis (kPCA) method established by Schölkopf et al in [35]. They showed that λ_k and {α_i^k}_{i=1}^m can be found in terms of the eigenvalues and eigenvectors of a centered kernel matrix.…”
Section: Shape Prior Regularization Term
confidence: 99%
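A sketch of that statement: after centering the kernel matrix, its eigendecomposition supplies λ_k and the coefficient vectors α^k, and the projection of any new point onto the k-th nonlinear component then needs only kernel evaluations against the training set. The snippet below is illustrative (kernel choice, sizes, and names are assumptions).

```python
# Sketch: the eigenvalues/eigenvectors of the centered kernel matrix give
# lambda_k and the coefficients alpha_i^k; new points are then projected
# using kernel evaluations alone. Kernel choice and names are illustrative.
import numpy as np

def rbf(A, B, gamma=0.5):
    d2 = (A**2).sum(1)[:, None] + (B**2).sum(1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * d2)

rng = np.random.RandomState(0)
X = rng.randn(80, 4)                         # training data
n = X.shape[0]
K = rbf(X, X)                                # uncentered kernel matrix
H = np.eye(n) - np.full((n, n), 1.0 / n)     # centering matrix
lam, V = np.linalg.eigh(H @ K @ H)           # lambda_k, eigenvectors of centered K
lam, V = lam[::-1], V[:, ::-1]               # decreasing eigenvalue order
alphas = V[:, :2] / np.sqrt(lam[:2])         # alpha^k, normalized

# Projection of new points onto the first two nonlinear components
X_new = rng.randn(5, 4)
k_x = rbf(X_new, X)                                       # test-vs-train kernel rows
one_n = np.full((n, n), 1.0 / n)
one_t = np.full((X_new.shape[0], n), 1.0 / n)
k_c = k_x - one_t @ K - k_x @ one_n + one_t @ K @ one_n   # consistent centering
scores = k_c @ alphas
```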
“…when using (35) to obtain the last equality. Comparing (36) and (30), it is clear that the problem of calculating J_prior has been reduced to finding a suitable centered kernel function k̃.…”
Section: Shape Prior Regularization Term
confidence: 99%
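For context, the reason such quantities reduce to kernel evaluations is that feature-space expressions depend on the data only through inner products. A standard identity of this kind, shown here as a generic illustration rather than the specific J_prior of the cited work, is

$$\|\varphi(z) - \varphi(z_i)\|^2 \;=\; k(z, z) \;-\; 2\,k(z, z_i) \;+\; k(z_i, z_i),$$

so the squared feature-space distance to a training example is computable from kernel values alone, without ever forming φ.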