2004
DOI: 10.1162/0899766041732396

Learning Eigenfunctions Links Spectral Embedding and Kernel PCA

Abstract: In this paper, we show a direct relation between spectral embedding methods and kernel PCA, and how both are special cases of a more general learning problem, that of learning the principal eigenfunctions of an operator defined from a kernel and the unknown data generating density.

Cited by 315 publications (221 citation statements) · References 13 publications
“…The Nyström formula [15] is a general method for approximating the eigenfunctions ψ_j(x) of a kernel from the eigenvectors ψ_j(x_i) of a sample-based kernel matrix. As we shall see, in the case of DM, the formula makes it possible to compute approximate embeddings of new patterns without recomputing the eigenvalues and eigenvectors of the similarity matrix of the training sample.…”
Section: Nyström Formula
confidence: 99%
“…, x_n}, let {λ_j}, {v_j} be the eigenvalues and eigenvectors of the sample-restricted kernel matrix k_ij = k(x_i, x_j). Then the general Nyström method [15] enables us to approximate the eigenfunctions u_l by the following expression, which extends the matrix eigenvectors v_j to a new point y…”
Section: Nyström Formula
confidence: 99%
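The extension described in these excerpts can be sketched as follows. This is a minimal illustration, assuming a Gaussian kernel and the common scaling f_j(y) = (1/λ_j) Σ_i v_j[i] k(x_i, y); the names `rbf_kernel` and `nystrom_embed` are ours, and normalization conventions for the Nyström formula vary across papers.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.5):
    """Gaussian kernel matrix between the rows of X and the rows of Y."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def nystrom_embed(X_train, Y_new, n_components=2, gamma=0.5):
    """Extend the top kernel-matrix eigenvectors to new points via
    f_j(y) = (1 / lambda_j) * sum_i v_j[i] * k(x_i, y)."""
    K = rbf_kernel(X_train, X_train, gamma)     # sample kernel matrix
    lam, V = np.linalg.eigh(K)                  # eigenvalues in ascending order
    top = np.argsort(lam)[::-1][:n_components]  # keep the leading components
    lam, V = lam[top], V[:, top]
    k_new = rbf_kernel(Y_new, X_train, gamma)   # k(y, x_i) for each new point y
    return k_new @ V / lam                      # shape (m, n_components)

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
Y = rng.normal(size=(5, 3))
emb_new = nystrom_embed(X, Y)    # embed new points without re-decomposing K
# Evaluated at the training points themselves, the extension reproduces
# the kernel-matrix eigenvectors exactly, since K v_j = lambda_j v_j.
emb_train = nystrom_embed(X, X)
```

The point of the sketch is the one emphasized in the excerpts: once the sample eigendecomposition is done, embedding new patterns costs only one kernel evaluation per training point, with no new eigendecomposition.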
“…This connection provides a metric multidimensional scaling algorithm to solve Kernel PCA, instead of an eigendecomposition of the Gram matrix. Bengio et al. [5] pointed out the link between Kernel PCA and spectral embedding. The direct relation lies in a more general learning problem: learning the principal eigenfunctions of operators defined from a kernel and the unknown data-generating density function.…”
Section: Introduction
confidence: 99%
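The kernel PCA side of this link can be made concrete: kernel PCA amounts to an eigendecomposition of the double-centered Gram matrix, and with a linear kernel it recovers ordinary PCA scores. A minimal sketch, where the function name `kernel_pca` and the scaling of eigenvectors by √λ follow a common convention but are our choices, not notation from the paper:

```python
import numpy as np

def kernel_pca(K, n_components=2):
    """Kernel PCA embedding from a Gram matrix K: double-center K,
    then scale the leading eigenvectors by sqrt(eigenvalue)."""
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    Kc = H @ K @ H                           # centered Gram matrix
    lam, V = np.linalg.eigh(Kc)              # ascending eigenvalues
    top = np.argsort(lam)[::-1][:n_components]
    return V[:, top] * np.sqrt(np.maximum(lam[top], 0.0))

rng = np.random.default_rng(1)
X = rng.normal(size=(40, 5))
K = X @ X.T                  # linear kernel: should recover ordinary PCA scores
Z = kernel_pca(K, 2)
```

With K = XXᵀ, the centered Gram matrix equals X_c X_cᵀ, whose scaled eigenvectors are exactly the principal-component scores of the centered data, up to sign.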
“…Moreover, a vector f_i has to be learnt in order to embed an example x_i into a K-dimensional space; each example is embedded separately, and after training, given a new example x*, there is no straightforward way to embed it. This is referred to as the "out-of-sample" problem, which several authors have tried to address with special techniques (e.g., [6,16,26]). Learning a function-based embedding might prove useful to solve the above problems.…”
Section: Point-based Methods
confidence: 99%
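The out-of-sample issue described here can be illustrated concretely. In the sketch below, the per-example vectors f_i of a point-based method are treated as fixed, and a kernel ridge regressor is fit from inputs to coordinates, yielding a function that can embed a new x* without re-running the embedding. All specifics (`rbf`, the regularization constant, the random data) are illustrative assumptions, not details from the cited works:

```python
import numpy as np

def rbf(X, Y, gamma=1.0):
    """Gaussian kernel matrix between the rows of X and the rows of Y."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

rng = np.random.default_rng(2)
X = rng.normal(size=(30, 4))     # training inputs
F = rng.normal(size=(30, 2))     # stand-in for point-based embedding vectors f_i

# Kernel ridge regression from inputs to coordinates: alpha = (K + reg*I)^-1 F
K = rbf(X, X)
alpha = np.linalg.solve(K + 1e-3 * np.eye(30), F)

x_star = rng.normal(size=(1, 4))   # new, out-of-sample example
f_star = rbf(x_star, X) @ alpha    # embedded via the learned function,
                                   # with no retraining of the embedding
```

This is one simple instance of the "function-based embedding" idea the excerpt alludes to: the embedding becomes a function of the input rather than a lookup table over training points.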