2018
DOI: 10.1109/msp.2018.2826566
Robust Subspace Learning: Robust PCA, Robust Subspace Tracking, and Robust Subspace Recovery

Abstract: PCA is one of the most widely used dimension reduction techniques. A related, easier problem is "subspace learning" or "subspace estimation". Given relatively clean data, both are easily solved via the singular value decomposition (SVD). The problem of subspace learning or PCA in the presence of outliers is called robust subspace learning or robust PCA (RPCA). For long data sequences, if one tries to use a single lower-dimensional subspace to represent the data, the required subspace dimension may end up being quit…



Cited by 301 publications (125 citation statements)
References 75 publications
“…The recovery problem (7) is a convex problem that can be solved by generic convex optimization solvers. The computational complexity may grow quickly with the matrix dimension, thereby calling for accelerated solutions such as alternating minimization [27] or adaptive updates using subspace learning approaches [28], which will be explored in the future to develop fast algorithms for solving (7). One main issue of the proposed recovery formulation (7) is that the L1-norm regularization may penalize the magnitude of nonzero entries of D. Hence, the solution D̂ tends to be smaller in magnitude than the actual values and is thus biased.…”
Section: Spatio-temporal Load Recoverymentioning
confidence: 99%
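The shrinkage bias described in this citation statement can be seen directly from the proximal operator of the L1 norm, i.e. soft thresholding, which shrinks every nonzero entry by the full regularization weight. A minimal numpy sketch (the function name `soft_threshold` and the example values are ours, not from the cited paper):

```python
import numpy as np

def soft_threshold(x, lam):
    # Proximal operator of the L1 norm: shrink each entry toward zero by lam.
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

# A sparse vector with large nonzero entries.
d_true = np.array([5.0, 0.0, -3.0, 0.0, 8.0])
lam = 0.5

d_hat = soft_threshold(d_true, lam)
print(d_hat)  # [ 4.5  0.  -2.5  0.   7.5]

# Every surviving nonzero entry is biased toward zero by exactly lam.
bias = np.abs(d_true) - np.abs(d_hat)
print(bias[d_true != 0])  # [0.5 0.5 0.5]
```

Zero entries stay zero, but each nonzero entry loses `lam` in magnitude, which is exactly the bias the quoted passage attributes to the L1 penalty.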
See 1 more Smart Citation
“…The recovery problem (7) is a convex problem that can be solved by generic convex optimization solvers. The computational complexity may grow fast with the matrix dimension, thereby calling for accelerated solutions such as alternating minimization [27] or adaptive updates using subspace learning approaches [28], which will be explored in the future to develop fast algorithms for solving (7). One main issue of the proposed recovery formulation (7) is that the L1-norm regularization may penalize the magnitude of nonzero entries of D. Hence, the solu-tionD tends to be smaller in magnitude than the actual values and is thus biased.…”
Section: Spatio-temporal Load Recoverymentioning
confidence: 99%
“…Certain conditions on K and D are required in order to achieve accurate RPCA results [9,28,29]. Loosely speaking, the low rank component K cannot be sparse and the sparse component D cannot be of low rank.…”
Section: Recovery Performancementioning
confidence: 99%
“…The harder problem of seeking a PCA that is effective for outlier-corrupted data is called robust PCA (RPCA). The term "outlier" has no mathematically precise meaning [24]. Thus, multiple methods have been attempted to define or quantify this term, such as alternating minimization [14], random sampling techniques [9,17], multivariate trimming [11], and so on [7,27].…”
Section: Introductionmentioning
confidence: 99%
“…The low-rank model corresponds to the background, and anomalies lie outside of the low-rank subspace [8]-[12]. In signal processing [13]-[15], there is much related work on robust principal component analysis (RPCA) and subspace tracking. Extensive literature surveys can be found in [16]-[18].…”
Section: Introductionmentioning
confidence: 99%