2021
DOI: 10.48550/arxiv.2111.05522
Preprint
On the Convergence of Orthogonal/Vector AMP: Long-Memory Message-Passing Strategy

Abstract: Orthogonal/vector approximate message-passing (AMP) is a powerful message-passing (MP) algorithm for signal reconstruction in compressed sensing. This paper proves the convergence of Bayes-optimal orthogonal/vector AMP in the large system limit. The proof strategy is based on a novel long-memory (LM) MP approach: the first step is the construction of an LM-MP algorithm that is guaranteed to converge in principle. The second step is a large-system analysis of LM-MP via an existing framework of state evolution. The third step is to …

Cited by 4 publications (10 citation statements) | References 44 publications
“…A distinguished feature of the AMP-type algorithms [3], [12], [13], [19], [20], [22]–[25], [27], [28] is that their dynamics can be rigorously described by state evolution [4], [13], [14], [21], [26]. However, state evolution does not necessarily guarantee the convergence of iterative algorithms [44]–[48]. Therefore, it is desired to find a new technique or framework that ensures the convergence of the AMP-type algorithms.…”
Section: B. Motivation and Related Work
confidence: 99%
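The statement above notes that the dynamics of AMP-type algorithms are rigorously tracked by a scalar state-evolution recursion. As a hedged illustration only (not code from the paper), the sketch below runs a Monte-Carlo state evolution for plain AMP with a soft-thresholding denoiser under a Bernoulli–Gaussian prior; all parameter values (`delta`, `sigma2`, `eps`, `alpha`) are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def soft_threshold(u, t):
    """Soft-thresholding denoiser: eta(u) = sign(u) * max(|u| - t, 0)."""
    return np.sign(u) * np.maximum(np.abs(u) - t, 0.0)

def state_evolution(delta=0.5, sigma2=0.01, eps=0.1, alpha=1.5,
                    n_iter=30, n_mc=200_000):
    """Track the effective noise variance tau_t^2 across iterations via
        tau_{t+1}^2 = sigma^2 + (1/delta) * E[(eta(X + tau_t Z, alpha tau_t) - X)^2],
    with X drawn from a Bernoulli(eps)-Gaussian prior and Z ~ N(0, 1).
    The expectation is approximated by Monte-Carlo averaging."""
    x = rng.standard_normal(n_mc) * (rng.random(n_mc) < eps)  # prior samples
    z = rng.standard_normal(n_mc)                             # Gaussian noise
    tau2 = sigma2 + np.mean(x**2) / delta                     # initialization
    history = [tau2]
    for _ in range(n_iter):
        mse = np.mean((soft_threshold(x + np.sqrt(tau2) * z,
                                      alpha * np.sqrt(tau2)) - x) ** 2)
        tau2 = sigma2 + mse / delta
        history.append(tau2)
    return history

hist = state_evolution()
```

In this regime the recursion contracts toward a small fixed point; as the quoted statement cautions, state evolution of this kind describes the dynamics but does not by itself certify convergence of the matrix-valued algorithm.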
“…In [22], [23], the authors first proposed an analytically optimized vector damping for Bayes-optimal MAMP (BO-MAMP) based on the current and all preceding messages, which not only solves the convergence problem of MAMP but also preserves orthogonality and Gaussianity, i.e., the dynamics of damped BO-MAMP can be correctly described by state evolution. Recently, the damping optimization in [22], [23] was used to analyze the convergence of Bayes-optimal OAMP/VAMP in [44] from a sufficient-statistic perspective. The works in [22], [23], [44] pave the way toward a novel principle for resolving the convergence of AMP-type algorithms.…”
Section: B. Motivation and Related Work
confidence: 99%
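The statement above credits optimized damping with fixing MAMP's convergence while leaving its fixed points, and hence its state evolution, intact. A hedged toy illustration of that mechanism, entirely separate from the cited BO-MAMP algorithm: damping a scalar update map whose undamped iteration diverges. The map `f` and the damping factor `beta` below are arbitrary illustrative choices.

```python
def f(x):
    """Toy update map: fixed point x* = 2, but slope -1.5 there,
    so the undamped iteration x <- f(x) diverges."""
    return -1.5 * x + 5.0

def iterate(x0, beta, n_iter):
    """Damped iteration x <- (1 - beta) * x + beta * f(x).
    Any fixed point of f stays a fixed point for every beta > 0;
    beta = 1 recovers the undamped update."""
    x = x0
    for _ in range(n_iter):
        x = (1.0 - beta) * x + beta * f(x)
    return x

x_undamped = iterate(0.0, beta=1.0, n_iter=20)  # diverges away from x* = 2
x_damped = iterate(0.0, beta=0.5, n_iter=50)    # converges to x* = 2
```

With `beta = 0.5` the damped map has slope `(1 - 0.5) + 0.5 * (-1.5) = -0.25` at the fixed point, so the iteration contracts; the cited works choose the damping analytically rather than by hand, and do so for vector-valued messages.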
“…RI-GAMP vs. Vector AMP. Vector AMP (VAMP) is an iterative algorithm (based on Expectation Propagation) recently proposed for estimation in rotationally invariant linear [RSF19, Tak20, Tak21b] and generalized linear models [SRF16, PSAR+20]. Like RI-GAMP, VAMP can be tailored to take advantage of prior information about the signal, and its performance can be characterized by a state evolution recursion.…”
Section: Introduction
confidence: 99%