2019
DOI: 10.1109/tit.2019.2893254

Optimization-Based AMP for Phase Retrieval: The Impact of Initialization and $\ell_{2}$ Regularization

Abstract: We consider an $\ell_{2}$-regularized non-convex optimization problem for recovering signals from their noisy phaseless observations. We design and study the performance of a message passing algorithm that aims to solve this optimization problem. We consider the asymptotic setting $m, n \to \infty$, $m/n \to \delta$ and obtain sharp performance bounds, where $m$ is the number of measurements and $n$ is the signal dimension. We show that for complex signals the algorithm can perform accurate recovery with only $m = \left(\frac{64}{\pi^{2}} - 4\right)n \approx 2.5n$ measure…
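The abstract only names the objective; as a minimal sketch (not the paper's AMP iteration), one plausible form of an $\ell_2$-regularized intensity loss for phase retrieval, with illustrative function names and a hypothetical penalty weight `lam`, is:

```python
import numpy as np

def regularized_intensity_loss(z, A, y, lam):
    """Intensity-based phase retrieval loss with an l2 penalty (a sketch):
    f(z) = 1/(4m) * sum_i (|<a_i, z>|^2 - y_i)^2 + (lam/2) * ||z||_2^2
    """
    m = A.shape[0]
    r = np.abs(A @ z) ** 2 - y
    return np.sum(r ** 2) / (4 * m) + 0.5 * lam * np.linalg.norm(z) ** 2

def loss_gradient(z, A, y, lam):
    """Wirtinger-style gradient of the loss above for complex z."""
    m = A.shape[0]
    Az = A @ z
    r = np.abs(Az) ** 2 - y
    return (A.conj().T @ (r * Az)) / m + lam * z

# Tiny synthetic instance: complex Gaussian signal and sensing matrix.
rng = np.random.default_rng(0)
n, m = 8, 40  # oversampling ratio delta = m/n = 5
x = rng.standard_normal(n) + 1j * rng.standard_normal(n)
A = (rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))) / np.sqrt(2)
y = np.abs(A @ x) ** 2  # noiseless phaseless observations

# At the true signal the data-fit term vanishes; only the penalty remains.
print(regularized_intensity_loss(x, A, y, lam=0.1))
```

This is one common smooth formulation of the non-convex problem the abstract refers to; the paper's actual message passing updates are not reproduced here.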

Cited by 51 publications (30 citation statements). References 66 publications.
“…Recall that $\mathbb{E}[Y] = 2xx^{\top} + \|x\|_{2}^{2} I_n$ has a bounded spectral norm; then (136) implies that, to keep the deviation between $Y$ and $\mathbb{E}[Y]$ well-controlled, we must at least have $(n \log m)/m \lesssim 1$.…”
Section: Truncated Spectral Methods For Sample Efficiency
confidence: 99%
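The expectation identity quoted above can be checked numerically. The sketch below assumes real Gaussian sensing vectors $a_i \sim \mathcal{N}(0, I_n)$, intensities $y_i = (a_i^{\top}x)^2$, and that $Y$ is the empirical matrix $(1/m)\sum_i y_i a_i a_i^{\top}$ used in standard spectral initialization (these conventions are assumptions, not stated in the quote):

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 4, 200_000
x = np.array([1.0, -2.0, 0.5, 0.0])

A = rng.standard_normal((m, n))      # rows a_i ~ N(0, I_n)
y = (A @ x) ** 2                     # phaseless intensity observations
Y = (A * y[:, None]).T @ A / m       # (1/m) * sum_i y_i a_i a_i^T

# Gaussian fourth-moment identity: E[Y] = 2 x x^T + ||x||_2^2 I_n
EY = 2 * np.outer(x, x) + np.dot(x, x) * np.eye(n)
print(np.abs(Y - EY).max())          # shrinks as m grows
```

The entrywise error decays like $O(1/\sqrt{m})$, consistent with the quote's point that $m$ must be large relative to $n \log m$ before $Y$ concentrates around $\mathbb{E}[Y]$ in spectral norm.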
“…A few other nonconvex matrix factorization algorithms have been left out due to space, including but not limited to normalized iterative hard thresholding (NIHT) [129], atomic decomposition for minimum rank approximation (Admira) [130], composite optimization (e.g. prox-linear algorithm) [131][132][133][134], approximate message passing [135][136][137], block coordinate descent [19], coordinate descent [138], and conjugate gradient [139]. The readers are referred to these papers for detailed descriptions.…”
Section: Further Pointers To Other Algorithms
confidence: 99%
“…Almost all of these nonconvex methods require carefully-designed initialization to guarantee a sufficiently accurate initial point. One exception is the approximate message passing algorithm proposed in [MXM18], which works as long as the correlation between the truth and the initial signal is bounded away from zero. This, however, does not accommodate the case when the initial signal strength is vanishingly small (like random initialization).…”
Section: Related Work
confidence: 99%
“…An active line of recent work studies nonconvex optimization algorithms for solving the classical phase retrieval problem (see, e.g., [1]- [8]). Compared to methods using convex relaxation [9]- [12], the nonconvex approaches tend to require much lower computational complexity and memory footprints.…”
Section: Introduction
confidence: 99%