2021
DOI: 10.1109/tit.2020.3025272
Algorithmic Analysis and Statistical Estimation of SLOPE via Approximate Message Passing

Cited by 28 publications (24 citation statements) · References 32 publications
“…This state-evolution characterization made AMP amenable as an analysis device for describing the statistical behavior of various problems, even though AMP is an effective algorithm in its own right. The AMP algorithm and machinery have been successfully applied to a variety of problems beyond compressed sensing, including but not limited to robust M-estimators (Donoho and Montanari, 2016), SLOPE (Bu et al, 2020), low-rank matrix estimation and PCA (Rangan and Fletcher, 2012; Montanari and Venkataramanan, 2021; Fan, 2020; Zhong et al, 2021), stochastic block models (Deshpande et al, 2015), phase retrieval (Ma et al, 2018), phase synchronization (Celentano et al, 2021), and generalized linear models (Rangan, 2011; Barbier et al, 2019). See Feng et al (2021) for an accessible introduction to this machinery and its applications.…”
Section: Approximate Message Passing
confidence: 99%
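The AMP recursion referred to in this excerpt, in its basic compressed-sensing form, pairs a scalar denoising step with an Onsager correction term that keeps the effective noise approximately Gaussian. A minimal sketch, assuming a soft-thresholding denoiser and an illustrative threshold-tuning parameter `alpha` (neither is a specific choice from the cited works):

```python
import numpy as np

def amp_lasso(A, y, alpha=2.0, iters=30):
    """Minimal AMP sketch for sparse linear regression.

    Pairs a soft-thresholding denoising step with the Onsager
    correction term; `alpha` is an illustrative threshold-tuning
    parameter, not a value from the cited works.
    """
    n, p = A.shape
    x = np.zeros(p)
    z = y.copy()
    soft = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t, 0.0)
    for _ in range(iters):
        tau = np.linalg.norm(z) / np.sqrt(n)    # empirical effective-noise level
        x_new = soft(x + A.T @ z, alpha * tau)  # denoising step
        # Onsager correction: the extra z * (#nonzeros)/n term is what
        # distinguishes AMP from plain iterative soft thresholding
        z = y - A @ x_new + z * (np.count_nonzero(x_new) / n)
        x = x_new
    return x
```

The state-evolution analysis mentioned in the excerpt tracks exactly the scalar `tau` across iterations, which is why the empirical norm of the residual serves as a noise estimate here.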
“…Sharp asymptotic predictions have only more recently appeared in the literature for the case of noisy linear measurements with Gaussian measurement vectors. By now, three different approaches have been used for the asymptotic analysis of convex regularized estimators: (i) one based on the approximate message passing (AMP) algorithm and its state-evolution analysis (e.g., [5, 8, 14, 20, 32-34]); (ii) one based on Gaussian process (GP) inequalities, specifically the convex Gaussian min-max theorem (CGMT) (e.g., [9, 10, 13, 15, 18, 19]); and (iii) the "leave-one-out" approach [11, 35]. The three approaches are quite different from one another, and each comes with its own distinguishing features and drawbacks.…”
Section: Related Work
confidence: 99%
“…Before stating the result, we provide some intuition about the proof, which builds on Theorem 2. The critical observation in the proof of Theorem 2 is that the effective noise σ_ℓ of ℓ is minimized (i.e., it attains the value σ_opt) if the Cauchy–Schwarz inequality in (20) holds with equality. Hence, we seek ℓ = ℓ_opt so that for some c ∈ R,…”
Section: On the Optimal Loss Function
confidence: 99%
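The equality case of Cauchy–Schwarz invoked in this excerpt is the standard proportionality condition; stated explicitly for generic vectors (the symbols u, v, c below are generic placeholders, not the paper's notation):

```latex
% Cauchy–Schwarz with its equality condition (generic vectors u, v):
\langle u, v \rangle^2 \le \|u\|^2 \, \|v\|^2
\qquad \text{for all } u, v \in \mathbb{R}^n,
% with equality if and only if the vectors are proportional:
\text{with equality iff } v = c\, u \text{ for some } c \in \mathbb{R}
\quad (\text{assuming } u \neq 0).
```

Matching this proportionality condition is what the quoted argument solves to pin down the optimal loss.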
“…This method relies on the computation of the proximal operator of the sorted ℓ1 norm. This operator is also a key tool in the development of approximate message passing theory (Bu et al, 2020; Zhang and Bu, 2021).…”
Section: Introduction
confidence: 99%
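The proximal operator of the sorted ℓ1 norm mentioned above can be computed exactly in O(n log n) by sorting the magnitudes and then running a pool-adjacent-violators pass. A minimal sketch under that standard construction (the function name and code structure are illustrative, not taken from the cited works):

```python
import numpy as np

def prox_sorted_l1(x, lam):
    """Proximal operator of the sorted l1 (SLOPE) norm.

    `lam` must be nonnegative and nonincreasing; computed by sorting
    |x| descending, shifting by lam, and projecting onto nonincreasing
    sequences via pool-adjacent-violators (illustrative sketch).
    """
    sign = np.sign(x)
    order = np.argsort(np.abs(x))[::-1]  # indices sorting |x| descending
    z = np.abs(x)[order] - lam           # shifted, sorted magnitudes
    sums, lens = [], []                  # block sums and lengths for PAVA
    for zi in z:
        sums.append(zi)
        lens.append(1)
        # pool adjacent blocks while the nonincreasing order is violated
        while len(sums) > 1 and sums[-1] / lens[-1] > sums[-2] / lens[-2]:
            s, l = sums.pop(), lens.pop()
            sums[-1] += s
            lens[-1] += l
    out = np.empty(len(x))
    pos = 0
    for s, l in zip(sums, lens):
        out[pos:pos + l] = max(s / l, 0.0)  # clip each block average at zero
        pos += l
    res = np.empty(len(x))
    res[order] = out                        # undo the sort
    return sign * res
```

With all weights equal, the operator reduces to ordinary soft thresholding, which is a convenient sanity check; unequal weights can pool entries into blocks sharing a common value.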