2019
DOI: 10.1137/18m1227846
Matrix Rigidity and the Ill-Posedness of Robust PCA and Matrix Completion

Abstract: Robust Principal Component Analysis (PCA) (Candès et al., 2011) and low-rank matrix completion (Recht et al., 2010) are extensions of PCA that allow for outliers and missing entries, respectively. It is well known that solving these problems requires low coherence between the low-rank matrix and the canonical basis, since in the extreme case, when the low-rank matrix we wish to recover is also sparse, there is an inherent ambiguity. However, the well-posedness issue in both problems is an even more fundament…
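The ambiguity mentioned in the abstract can be made concrete with a small NumPy sketch (illustrative only, not from the paper): a matrix with a single nonzero entry is simultaneously rank-1 and 1-sparse, so two different low-rank-plus-sparse decompositions explain it equally well.

```python
import numpy as np

# M is both rank-1 and 1-sparse: a single nonzero entry.
M = np.zeros((5, 5))
M[0, 0] = 1.0

# Two equally valid "low-rank + sparse" explanations M = L + S:
L1, S1 = M.copy(), np.zeros_like(M)   # treat M as the low-rank part
L2, S2 = np.zeros_like(M), M.copy()   # treat M as the sparse outlier part

assert np.linalg.matrix_rank(L1) == 1 and np.count_nonzero(S1) == 0
assert np.linalg.matrix_rank(L2) == 0 and np.count_nonzero(S2) == 1
assert np.allclose(L1 + S1, M) and np.allclose(L2 + S2, M)
```

No algorithm can distinguish the two decompositions from M alone, which is why low coherence with the canonical basis is assumed in Robust PCA and matrix completion.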

Cited by 5 publications (6 citation statements) | References 48 publications
“…Fig. 1 shows the recovery performance of the convex relaxation (8) and Algorithm 1 with different d. We can see that Algorithm 1 performs better than the convex relaxation under the same d, and the performance of Algorithm 1 improves when d increases. When d = 5, the average running time of Algorithm 1 is 0.35 seconds, while the convex relaxation using CVX needs 300 seconds.…”
Section: Results
confidence: 99%
“…Fig. 2 shows the comparison of Algorithm 1 with the locally weighted matrix smoothing (LOWEMS) method proposed in [21], and the convex relaxation (8). LOWEMS alternately minimizes the objective function of equation (13) in [21] over U and V.…”
Section: Results
confidence: 99%
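The alternating minimization over U and V mentioned in the statement above can be sketched generically. This is a minimal alternating least-squares loop for fitting a rank-d factorization U Vᵀ; it is not the exact LOWEMS objective of [21] (which adds locally weighted terms), and the random test matrix and dimensions are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 2
# Random noiseless rank-d target matrix.
A = rng.standard_normal((30, d)) @ rng.standard_normal((d, 20))

U = rng.standard_normal((30, d))
V = rng.standard_normal((20, d))
for _ in range(50):
    # Fix V, minimize ||A - U V^T||_F^2 over U (a least-squares solve),
    # then swap roles and minimize over V.
    U = np.linalg.lstsq(V, A.T, rcond=None)[0].T
    V = np.linalg.lstsq(U, A, rcond=None)[0].T

# Each half-step solves a convex subproblem, so the objective is monotone.
assert np.linalg.norm(A - U @ V.T) < 1e-4 * np.linalg.norm(A)
```

Each subproblem is an ordinary least-squares solve, which is why alternating schemes like LOWEMS are much faster per iteration than solving the full convex relaxation with a general-purpose solver such as CVX.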
“…Our contributions extend existing results on restricted isometry constants (RIC) for Gaussian and other measurement operators for sparse vectors [25] or low-rank matrices [11] to the sets of low-rank plus sparse matrices. For the set of matrices which are the sum of a low-rank plus a sparse matrix the results differ subtly due to the space not being closed, in that there are matrices X for which there does not exist a nearest projection to the set of low-rank plus sparse matrices [26]. To overcome this, we introduce the set of low-rank plus sparse matrices with a constraint on the Frobenius norm of its low-rank component, see Definition 1.1.…”
Section: Introduction
confidence: 99%
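The need for the Frobenius-norm constraint on the low-rank component (Definition 1.1 in the statement above) can be illustrated with a small sketch (hypothetical, not the construction from [26]): even a fixed matrix admits low-rank-plus-sparse decompositions whose low-rank component has arbitrarily large norm, while the sum stays unchanged.

```python
import numpy as np

X = np.zeros((4, 4))
X[0, 1] = 1.0  # X is itself rank-1 and 1-sparse

for t in [1.0, 1e3, 1e6]:
    L = np.zeros((4, 4))
    L[0, 0] = t          # rank-1 component with Frobenius norm t
    S = X - L            # only 2 nonzero entries: still sparse
    assert np.linalg.matrix_rank(L) == 1
    assert np.count_nonzero(S) == 2
    assert np.allclose(L + S, X)
# ||L||_F grows without bound while L + S = X is fixed, so bounding the
# Frobenius norm of the low-rank part restores a well-behaved search set.
```

Without such a bound the set of low-rank plus sparse matrices is not closed, and a nearest-point projection onto it may fail to exist, as noted in the quoted passage.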