2021
DOI: 10.48550/arxiv.2103.04220
Preprint

Euclidean Representation of Low-Rank Matrices and Its Statistical Applications

Fangzheng Xie

Abstract: Low-rank matrices are pervasive throughout statistics, machine learning, signal processing, optimization, and applied mathematics. In this paper, we propose a novel and user-friendly Euclidean representation framework for low-rank matrices. Correspondingly, we establish a collection of technical and theoretical tools for analyzing the intrinsic perturbation of low-rank matrices in which the underlying referential matrix and the perturbed matrix both live on the same low-rank matrix manifold. Our analyses show …
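As a rough numerical illustration of the intrinsic-perturbation setting the abstract describes, the sketch below builds a reference matrix and a perturbed matrix that both lie on the fixed-rank manifold and compares their leading singular subspaces. This is a minimal sketch under stated assumptions, not the paper's actual framework: the rank r, the factor-based perturbation, and the sin-theta comparison are all illustrative choices.

```python
# Minimal sketch of an "intrinsic" low-rank perturbation: both the reference
# matrix X and the perturbed matrix X_tilde have exactly rank r, i.e. both
# live on the same fixed-rank matrix manifold. The construction and the
# sin-theta comparison are illustrative assumptions, not the paper's method.
import numpy as np

rng = np.random.default_rng(0)
n, p, r = 200, 100, 3

# Reference rank-r matrix X = U S V^T with orthonormal factor columns.
U, _ = np.linalg.qr(rng.standard_normal((n, r)))
V, _ = np.linalg.qr(rng.standard_normal((p, r)))
S = np.diag([10.0, 8.0, 6.0])
X = U @ S @ V.T

# Perturb the factors (not the ambient matrix), so X_tilde stays rank r.
eps = 0.05
U_t, _ = np.linalg.qr(U + eps * rng.standard_normal((n, r)))
V_t, _ = np.linalg.qr(V + eps * rng.standard_normal((p, r)))
X_tilde = U_t @ S @ V_t.T

# Compare leading left singular subspaces via the sin-theta distance:
# singular values of U^T U_t are cosines of the principal angles.
cosines = np.linalg.svd(U.T @ U_t, compute_uv=False)
sin_theta = np.sqrt(np.clip(1.0 - cosines**2, 0.0, None))
print("rank(X), rank(X_tilde):",
      np.linalg.matrix_rank(X), np.linalg.matrix_rank(X_tilde))
print("sin-theta distances:", np.round(sin_theta, 4))
```

Because the perturbation acts on the factors rather than on the ambient matrix, X_tilde never leaves the rank-r manifold, which is the setting the abstract contrasts with ordinary additive (ambient) perturbation analysis.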

Cited by 1 publication (1 citation statement)
References 65 publications
“…Spectral methods, which refer to a collection of tools and techniques informed by matrix analysis and eigendecompositions, underpin a number of methods used in high-dimensional multivariate statistics and data science, including but not limited to network analysis (Abbe et al, 2020a,b; Agterberg et al, 2020; Athreya et al, 2017; Cai et al, 2020; Jin et al, 2019; Lei, 2019; Lei and Rinaldo, 2015; Mao et al, 2020; Rubin-Delanchy et al, 2020), principal components analysis (Cai et al, 2020; Cai and Zhang, 2018; Koltchinskii et al, 2020; Koltchinskii and Lounici, 2017; Koltchinskii and Xia, 2016; Koltchinskii and Lounici, 2016; Wang and Fan, 2017; Johnstone and Lu, 2009; Lounici, 2013, 2014; Xie et al, 2019; Zhu et al, 2019), and spectral clustering (Abbe et al, 2020a,b; Amini and Razaee, 2021; Cai et al, 2020; Lei, 2019; Schiebinger et al, 2015; Srivastava et al, 2021). In addition, eigenvectors or related quantities can be used as a "warm start" for optimization methods (Chen et al, 2020; Chi et al, 2019; Lu and Li, 2017; Xie and Xu, 2020; Xie, 2021), yielding provable convergence to quantities of interest provided the initialization is sufficiently close to the optimum. The model we consider includes as a submodel the high-dimensional K-component mixture model.…”
Section: Related Work
confidence: 99%
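The citation statement above describes generic spectral techniques rather than anything specific to this paper. As a rough sketch of the two uses it names, clustering via leading eigenvectors and eigenvectors as a "warm start", the example below clusters points by running k-means on the spectral embedding of a similarity matrix. The Gaussian-mixture data, kernel bandwidth, and number of clusters K are all illustrative assumptions.

```python
# Sketch of spectral clustering in the sense the citation statement describes:
# cluster by running k-means on the top-K eigenvectors of a graph Laplacian.
# The two-component Gaussian mixture and kernel bandwidth are assumptions.
import numpy as np
from scipy.spatial.distance import cdist
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
K = 2
# Two well-separated Gaussian clusters in R^2 (a K-component mixture).
X = np.vstack([rng.normal(0.0, 0.5, (50, 2)),
               rng.normal(3.0, 0.5, (50, 2))])

# Gaussian kernel similarity matrix W.
W = np.exp(-cdist(X, X, "sqeuclidean") / 2.0)

# Symmetric normalized graph Laplacian L = I - D^{-1/2} W D^{-1/2}.
d = W.sum(axis=1)
L = np.eye(len(X)) - (W / np.sqrt(d)[:, None]) / np.sqrt(d)[None, :]

# Eigenvectors for the K smallest eigenvalues form the spectral embedding;
# k-means on this embedding recovers the clusters.
eigvals, eigvecs = np.linalg.eigh(L)
embedding = eigvecs[:, :K]
labels = KMeans(n_clusters=K, n_init=10, random_state=0).fit_predict(embedding)
print("cluster sizes:", np.bincount(labels))
```

In the warm-start usage the statement mentions, the same embedding (or centers derived from it) would initialize an iterative nonconvex optimization routine in place of a random starting point, which is what makes the cited local-convergence guarantees applicable.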