2010
DOI: 10.1002/nla.743

Trace optimization and eigenproblems in dimension reduction methods

Abstract: This paper gives an overview of the eigenvalue problems encountered in areas of data mining that are related to dimension reduction. Given some input high-dimensional data, the goal of dimension reduction is to map them to a low-dimensional space such that certain properties of the initial data are preserved. Optimizing these properties among the reduced data can typically be posed as a trace optimization problem that leads to an eigenvalue problem. There is a rich variety of such problems and the…
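The trace optimization mentioned in the abstract reduces to an eigenvalue problem through a standard fact (the Ky Fan theorem): maximizing trace(V^T A V) over matrices V with orthonormal columns is achieved by the eigenvectors of A associated with its largest eigenvalues. The short NumPy sketch below illustrates this fact only; the symmetric matrix A and the reduced dimension d are arbitrary placeholders, not quantities taken from the paper.

import numpy as np

# Sketch: max_{V^T V = I} trace(V^T A V) is attained by the eigenvectors
# of A belonging to its d largest eigenvalues (placeholder A and d).
rng = np.random.default_rng(0)
M = rng.standard_normal((50, 50))
A = (M + M.T) / 2                      # arbitrary symmetric matrix
d = 3                                  # reduced dimension

eigvals, eigvecs = np.linalg.eigh(A)   # eigenvalues in ascending order
V = eigvecs[:, -d:]                    # columns: d leading eigenvectors

# The attained trace equals the sum of the d largest eigenvalues.
print(np.trace(V.T @ A @ V), eigvals[-d:].sum())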

Cited by 169 publications (96 citation statements)
References 52 publications
“…until convergence two-stage procedure. First, we fix F(W) (i.e., assume that it does not depend on W), and compute the solution of the resulting approximation of (35), which can be achieved by computing the m smallest eigenvectors of F(W) (Kokiopoulou et al. 2011). Given the new W, we update F(W), and iterate.…”
Section: Lemma
confidence: 99%
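The alternating scheme quoted above can be written generically as follows. This is a sketch of the general pattern under the assumption that F maps the current W to a symmetric matrix and that W is replaced by the m smallest eigenvectors of F(W); the placeholder F used in the example is hypothetical and is not the matrix from the citing paper or from its problem (35).

import numpy as np

def alternating_smallest_eigs(F, W0, m, max_iter=100, tol=1e-8):
    # Two-stage iteration: (1) freeze F at the current W and take the
    # m smallest eigenvectors of F(W) as the new W; (2) recompute F(W).
    # Convergence is checked on the projector W W^T to ignore sign/order.
    W = W0
    for _ in range(max_iter):
        vals, vecs = np.linalg.eigh(F(W))     # ascending eigenvalues
        W_new = vecs[:, :m]                   # m smallest eigenvectors
        if np.linalg.norm(W_new @ W_new.T - W @ W.T) < tol:
            return W_new
        W = W_new
    return W

# Hypothetical example: a fixed symmetric part plus a W-dependent term.
n, m = 30, 2
D = np.diag(np.arange(1.0, n + 1))
W = alternating_smallest_eigs(lambda W: D + W @ W.T, np.eye(n)[:, :m], m)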
“…[Benson et al., 2015, Cunningham and Ghahramani, 2015, Kokiopoulou et al., 2011, Wu et al., 2016]. For example, the standard PCA, PLS, orthonormalized PLS (OPLS) and CCA can be formulated as the following EVD/GEVD problems: PCA: …”
Section: TT Network for Tracking a Few Extreme Singular Values and S…
confidence: 99%
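As an illustration of the EVD/GEVD formulations the quoted passage refers to, the sketch below computes PCA as an eigendecomposition of the sample covariance and CCA as a generalized eigenvalue problem. The data matrices, ridge terms and the choice of two components are placeholders; this is standard textbook linear algebra, not code from either paper.

import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(1)
X = rng.standard_normal((200, 10))           # placeholder data (samples x features)
Y = rng.standard_normal((200, 6))
Xc, Yc = X - X.mean(0), Y - Y.mean(0)

# PCA as an EVD: leading eigenvectors of the covariance of X.
Cxx = Xc.T @ Xc / len(Xc)
w, V = np.linalg.eigh(Cxx)
pca_directions = V[:, ::-1][:, :2]           # top-2 principal directions

# CCA as a GEVD: C_xy C_yy^{-1} C_yx a = rho^2 C_xx a.
Cyy = Yc.T @ Yc / len(Yc) + 1e-8 * np.eye(Y.shape[1])   # small ridge for stability
Cxy = Xc.T @ Yc / len(Xc)
rho2, A = eigh(Cxy @ np.linalg.solve(Cyy, Cxy.T),
               Cxx + 1e-8 * np.eye(X.shape[1]))
cca_directions = A[:, ::-1][:, :2]           # top-2 canonical directions for X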
“…The solution to this problem consists of a matrix W ∈ R^{d×k} whose k column vectors are the eigenvectors corresponding to the top-k eigenvalues of the matrix X (λ1/|E| L + λ2/n I_n) X^T ∈ R^{d×d} (Kokiopoulou et al. 2011). The computational complexity of finding an optimal projection W consists of two parts: (1) solving a convex optimization problem to obtain the background distribution.…”
Section: Subjectively Interesting Patterns
confidence: 99%
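Under the reading of the formula reconstructed above, the projection W collects the top-k eigenvectors of a d×d matrix assembled from the data X (whose n columns are the data points), a graph Laplacian L over the samples, and weights λ1, λ2. The sketch below uses placeholder data, a random graph and arbitrary weights; all names and values are illustrative assumptions rather than those of the citing paper.

import numpy as np

rng = np.random.default_rng(2)
d, n, k = 8, 100, 3
X = rng.standard_normal((d, n))              # placeholder data, d features x n samples

# Placeholder graph over the samples and its Laplacian L (|E| = number of edges).
Adj = np.triu((rng.random((n, n)) < 0.05).astype(float), 1)
Adj = Adj + Adj.T
L = np.diag(Adj.sum(1)) - Adj
num_edges = int(Adj.sum() / 2)
lam1, lam2 = 1.0, 0.1                        # hypothetical weights

# Reconstructed d x d matrix X (lam1/|E| L + lam2/n I_n) X^T and its top-k eigenvectors.
M = X @ (lam1 / max(num_edges, 1) * L + lam2 / n * np.eye(n)) @ X.T
vals, vecs = np.linalg.eigh(M)
W = vecs[:, ::-1][:, :k]                     # columns: eigenvectors of the k largest eigenvalues
Z = W.T @ X                                  # projected data, k x n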