2021
DOI: 10.48550/arxiv.2105.02180
Preprint

A unifying tutorial on Approximate Message Passing

Abstract: Over the last decade or so, Approximate Message Passing (AMP) algorithms have become extremely popular in various structured high-dimensional statistical problems. The fact that the origins of these techniques can be traced back to notions of belief propagation in the statistical physics literature lends a certain mystique to the area for many statisticians. Our goal in this work is to present the main ideas of AMP from a statistical perspective, to illustrate the power and flexibility of the AMP framework. Al…

Cited by 4 publications (4 citation statements)
References 96 publications
“…The AMP algorithm and machinery has been successfully applied to a variety of problems beyond compressed sensing, including but not limited to robust M-estimators (Donoho and Montanari, 2016), SLOPE (Bu et al., 2020), low-rank matrix estimation and PCA (Rangan and Fletcher, 2012; Montanari and Venkataramanan, 2021; Fan, 2020; Zhong et al., 2021), stochastic block models (Deshpande et al., 2015), phase retrieval (Ma et al., 2018), phase synchronization (Celentano et al., 2021), and generalized linear models (Rangan, 2011; Barbier et al., 2019). See Feng et al. (2021) for an accessible introduction to this machinery and its applications. Moreover, a dominant fraction of the AMP works has focused on high-dimensional asymptotics (so that the problem dimension tends to infinity before the number of iterations), except for Rush and Venkataramanan (2018), which derived finite-sample guarantees allowing the number of iterations to grow up to O(log n/ log log n).…”
Section: Approximate Message Passing
confidence: 99%
“…Approximate Message Passing (AMP) belongs to the family of iterative thresholding algorithms and was initially conceived for compressed sensing applications [21]. Since then, variations of the algorithm have been applied to statistical estimation tasks in machine learning, image processing, and communications [22]. Based on this, the authors in [13], [23] propose the LArge MIMO AMP (LAMA) algorithm, which is shown in Algorithm 1.…”
Section: B Approximate Message Passing
confidence: 99%
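To make the iterative-thresholding structure mentioned in the statement above concrete, here is a minimal sketch of the basic AMP iteration for compressed sensing with a soft-thresholding denoiser and the Onsager correction term. The threshold rule, parameter values, and function names are illustrative assumptions rather than details from the cited works, and this is not the LAMA variant discussed in the quoted passage.

```python
import numpy as np

def soft_threshold(u, tau):
    """Componentwise soft-thresholding denoiser eta(u; tau)."""
    return np.sign(u) * np.maximum(np.abs(u) - tau, 0.0)

def amp_compressed_sensing(y, A, num_iters=30, alpha=1.5):
    """Basic AMP for y = A x + noise with a sparse x (illustrative sketch).

    Iteration:
        x_{t+1} = eta(x_t + A^T z_t; tau_t)
        z_{t+1} = y - A x_{t+1} + (z_t / delta) * <eta'(x_t + A^T z_t; tau_t)>
    where delta = n / p and the last summand is the Onsager correction.
    """
    n, p = A.shape
    delta = n / p
    x = np.zeros(p)
    z = y.copy()
    for _ in range(num_iters):
        tau = alpha * np.linalg.norm(z) / np.sqrt(n)  # heuristic threshold rule (assumption)
        pseudo_data = x + A.T @ z
        x_new = soft_threshold(pseudo_data, tau)
        # <eta'> for soft thresholding is the fraction of coordinates above threshold
        onsager = (z / delta) * np.mean(np.abs(x_new) > 0)
        z = y - A @ x_new + onsager
        x = x_new
    return x

# usage: recover a sparse signal from Gaussian measurements (synthetic example)
rng = np.random.default_rng(0)
n, p, k = 250, 500, 25
A = rng.normal(size=(n, p)) / np.sqrt(n)
x_true = np.zeros(p)
x_true[rng.choice(p, k, replace=False)] = rng.normal(size=k)
y = A @ x_true + 0.01 * rng.normal(size=n)
x_hat = amp_compressed_sensing(y, A)
print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```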
“…First developed for Bayesian linear regression and compressed sensing in [26][27][28][35], they have since been applied to many high-dimensional problems arising in statistics and machine learning, including Lasso estimation and sparse linear regression [4,39], generalized linear models and phase retrieval [52,56,60], robust linear regression [24], sparse or structured principal components analysis (PCA) [22,23,44,53], group synchronization problems [51], deep learning [12,13,40] and optimization in spin glass models [1,32,42]. We refer to [30] for a recent review.…”
Section: Introduction
confidence: 99%