2021
DOI: 10.1137/20m1315294
Rank $2r$ Iterative Least Squares: Efficient Recovery of Ill-Conditioned Low Rank Matrices from Few Entries

Abstract: We present a new, simple, and computationally efficient iterative method for low rank matrix completion. Our method is inspired by the class of factorization-type iterative algorithms, but substantially differs from them in the way the problem is cast. Precisely, given a target rank r, instead of optimizing on the manifold of rank r matrices, we allow our interim estimated matrix to have a specific overparametrized rank 2r structure. Our algorithm, denoted R2RILS, for rank 2r iterative least squares, thus has …
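The abstract only hints at the iteration, so the following is a minimal, hedged Python sketch of the kind of rank-2r least-squares update it describes. Everything here (the function name `r2rils_step`, the dense design matrix, the averaging-plus-QR update) is an illustrative assumption, not the paper's exact method; the actual R2RILS update and its normalization differ in their details.

```python
import numpy as np

def r2rils_step(X_obs, mask, U, V):
    """One simplified rank-2r least-squares iteration (sketch only).

    Solves for (A, B) in the overparametrized model U @ B.T + A @ V.T
    on the observed entries, taking the minimal-norm least-squares
    solution, then re-orthonormalizes the averaged factors. This is an
    assumption-laden sketch, not the paper's exact algorithm.
    """
    n1, r = U.shape
    n2 = V.shape[0]
    rows, cols = np.nonzero(mask)
    m = len(rows)
    # Unknowns: vec(A) (n1*r entries) followed by vec(B) (n2*r entries).
    # Dense design matrix for clarity; a real implementation would use a
    # sparse solver such as LSQR.
    D = np.zeros((m, (n1 + n2) * r))
    for k in range(m):
        i, j = rows[k], cols[k]
        D[k, i * r:(i + 1) * r] = V[j]                    # A[i] . V[j] term
        D[k, n1 * r + j * r:n1 * r + (j + 1) * r] = U[i]  # U[i] . B[j] term
    b = X_obs[rows, cols]
    sol, *_ = np.linalg.lstsq(D, b, rcond=None)  # minimal-norm LS solution
    A = sol[:n1 * r].reshape(n1, r)
    B = sol[n1 * r:].reshape(n2, r)
    # New orthonormal column spaces from the averaged factors.
    U_new, _ = np.linalg.qr(U + A)
    V_new, _ = np.linalg.qr(V + B)
    return U_new, V_new
```

Because the rank-2r parametrization is overcomplete, the least-squares system is rank-deficient; taking the minimal-norm solution (as `np.linalg.lstsq` does) is what makes the step well defined.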

Cited by 10 publications (5 citation statements). References 41 publications.
“…Various methods were proposed, including the convex relaxation (Mu et al., 2014; Raskutti et al., 2019; Tomioka et al., 2011), projected gradient descent (Rauhut et al., 2017; Chen et al., 2019a; Ahmed et al., 2020; Yu and Liu, 2016), gradient descent on the factorized model (Han et al., 2020b; Cai et al., 2019; Hao et al., 2020), alternating minimization (Zhou et al., 2013; Jain and Oh, 2014; Liu and Moitra, 2020; Xia et al., 2020), and importance sketching (Zhang et al., 2020a). Moreover, when the target tensor has order two, our problem reduces to the widely studied low-rank matrix recovery/estimation (Recht et al., 2010; Li et al., 2019; Ma et al., 2019; Sun and Luo, 2015; Tu et al., 2016; Wang et al., 2017; Zhao et al., 2015; Zheng and Lafferty, 2015; Charisopoulos et al., 2021; Luo et al., 2020; Bauch et al., 2021). The readers are referred to a recent survey in Chi et al. (2019).…”
Section: Related Literature (mentioning)
confidence: 99%
“…The second method related to GNMR is the R2RILS algorithm [BNZ21]. Given an estimate (U_t, V_t), the first step of R2RILS computes the minimal norm solution (Ũ, Ṽ) of (7a), as in the averaging variant of GNMR.…”
Section: Basin of Attraction, Error Decay Rate (mentioning)
confidence: 99%
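A note on the "minimal norm solution" in the statement above: because the rank-2r parametrization is not identifiable, the least-squares system is underdetermined and has infinitely many minimizers. A toy illustration of selecting the minimal-norm one, assuming NumPy's SVD-based solver (the equation label (7a) refers to the citing paper, not to this snippet):

```python
import numpy as np

# An underdetermined system D x = b has infinitely many least-squares
# minimizers; np.linalg.lstsq (SVD-based) returns the one of minimal
# Euclidean norm, i.e. x = pinv(D) @ b.
D = np.array([[1.0, 2.0, 0.0]])  # one equation, three unknowns (toy example)
b = np.array([5.0])
x, *_ = np.linalg.lstsq(D, b, rcond=None)
# minimal-norm solution: x = D.T @ inv(D @ D.T) @ b = [1., 2., 0.]
```

Any vector of the form [1, 2, 0] + t·[2, -1, 0] + s·[0, 0, 1] also solves the system; the minimal-norm choice zeroes the components in the null space of D.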
“…Most matrix completion methods proposed thus far suffer from two limitations: they may fail to recover the underlying matrix X* when X* is even mildly ill-conditioned, or when the number of observed entries m is relatively small [TW13, BNZ21, KV20]. These limitations may pose a significant drawback in practical applications.…”
Section: Introduction (mentioning)
confidence: 99%
“…First, from an algorithmic perspective, a number of algorithms, including penalty approaches, gradient descent, alternating minimization, and Gauss-Newton, have been developed either for solving the manifold formulation [BPS20, GS10, BA11, MMBS14, MBS11, Van13, HH18, LHLZ20] or the factorization formulation [CLS15, JNS13, SL15, TD21, TBS+16, WYZ12, BNZ21]. We refer readers to [CLC19, CW18a] for the recent algorithmic development under the two formulations.…”
Section: Related Literature (mentioning)
confidence: 99%