2017
DOI: 10.1111/insr.12220
Online Principal Component Analysis in High Dimension: Which Algorithm to Choose?

Abstract: Principal component analysis (PCA) is a method of choice for dimension reduction. In the current context of data explosion, online techniques that do not require storing all data in memory are indispensable to perform the PCA of streaming data and/or massive data. Despite the wide availability of recursive algorithms that can efficiently update the PCA when new data are observed, the literature offers little guidance on how to select a suitable algorithm for a given application. This paper reviews the …

Cited by 84 publications (61 citation statements)
References 45 publications
“…One can use the divide and conquer approach presented in subsection A to compute first the matrix M_h = X_h^T Y_h and then evaluate the SVD of this matrix. We present here an alternative approach [58] by considering an incremental version of the SVD.…”
Section: Incremental SVD When N Is Large
confidence: 99%
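As a rough illustration of the idea in this excerpt, the sketch below shows a generic block-update scheme for a truncated SVD when new columns arrive. This is one common incremental-SVD construction, not necessarily the exact algorithm of [58]; the function name and the fixed working rank are illustrative assumptions.

```python
import numpy as np

def incremental_svd_update(U, s, C, rank):
    """Update a truncated SVD (left factors U, singular values s)
    when new columns C are appended to the data matrix.
    Generic block-update sketch; notation is illustrative."""
    # Project the new columns onto the current left singular subspace
    L = U.T @ C
    # Orthonormal basis for the part of C outside that subspace
    J, K = np.linalg.qr(C - U @ L)
    k = len(s)
    # Small core matrix whose SVD yields the updated factors
    Q = np.block([[np.diag(s), L],
                  [np.zeros((K.shape[0], k)), K]])
    Uq, sq, _ = np.linalg.svd(Q)
    # Rotate the enlarged basis and truncate back to the working rank
    return np.hstack([U, J]) @ Uq[:, :rank], sq[:rank]
```

Because the update only factorizes the small core matrix Q rather than the full data matrix, its cost is driven by the working rank and the block size, not by the total number of columns seen so far.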
“…To assess the statistical accuracy of the algorithms considered, we choose a metric that we refer to as the subspace error, used previously by Cardot and Degras [13]. Given a matrix U ∈ R^{D×K} with orthonormal columns, the orthogonal projector onto the range of U is P_U ≡ UU^T.…”
Section: Performance Metric: Accuracy
confidence: 99%
“…Projecting V_{n+1} onto the closed convex cone of nonnegative operators would require computing the eigenvalues of V_{n+1}, which is time-consuming in high dimension even if V_{n+1} is a rank-one perturbation of V_n (see Cardot and Degras (2015)). We consider the following simple approximation to this projection, which consists of replacing in (13) the descent step γ_n by a thresholded one,…”
Section: Efficient Recursive Algorithms
confidence: 99%
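For reference, the exact projection that the excerpt describes as too costly is the standard eigenvalue-clipping map onto the cone of nonnegative (positive semidefinite) operators. The sketch below implements that exact projection, not the cheaper thresholded-step approximation the authors adopt; the O(D^3) eigendecomposition is precisely the expense they avoid.

```python
import numpy as np

def project_psd(V):
    """Exact projection of a symmetric matrix onto the closed convex
    cone of nonnegative operators: eigendecompose and clip negative
    eigenvalues at zero. Costs O(D^3), hence impractical when D is large."""
    w, Q = np.linalg.eigh(V)
    # Reassemble with negative eigenvalues replaced by zero
    return (Q * np.clip(w, 0.0, None)) @ Q.T
```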