2019
DOI: 10.1109/tit.2019.2902924

Sample-Efficient Algorithms for Recovering Structured Signals From Magnitude-Only Measurements

Abstract: We consider the problem of recovering a signal x* ∈ R^n from magnitude-only measurements y_i = |⟨a_i, x*⟩| for i ∈ {1, 2, …, m}. This is a stylized version of the classical phase retrieval problem, and is a fundamental challenge in nano- and bio-imaging systems, astronomical imaging, and speech processing. It is well known that the above problem is ill-posed, and therefore some additional assumptions on the signal and/or the measurements are necessary. In this paper, we consider the case where the underl…
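
The measurement model in the abstract is easy to simulate. Below is a minimal sketch, assuming real-valued signals and i.i.d. Gaussian sensing vectors; the dimensions and variable names (n, m, A, x_star) are illustrative choices, not taken from the paper.

```python
import numpy as np

# Illustrative sketch (not from the paper): generate magnitude-only
# measurements y_i = |<a_i, x*>| with i.i.d. Gaussian sensing vectors.
rng = np.random.default_rng(0)

n, m = 256, 2048                 # example signal dimension and number of measurements
x_star = rng.standard_normal(n)  # unknown signal x* in R^n
A = rng.standard_normal((m, n))  # rows of A are the sensing vectors a_i
y = np.abs(A @ x_star)           # y_i = |<a_i, x*>|; the signs (phases) are lost

# The sign ambiguity: x* and -x* produce identical measurements, so recovery
# is only possible up to a global sign, which is why the problem is ill-posed
# without further assumptions.
assert np.allclose(y, np.abs(A @ -x_star))
```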

Cited by 39 publications (72 citation statements). References 68 publications.

“…In Table III, we demonstrate this via a simple experiment. We compare AltMinLowRaP with the most recent provable sparse PR algorithm [18], CoPRAM, applied assuming wavelet sparsity (which is a generic choice for any piecewise smooth image, but is not necessarily the best choice for the particular image). As can be seen, AltMinLowRaP has significantly superior performance not just for the real image sequence, but also for its deliberately sparsified version.…”
Section: A. Low Rank PR (LRPR) Problem Setting and Notation
confidence: 99%
“…X* has rank r.
LRMC (first) [25]: No; left & right incoherence, X* has rank r; m ≥ C(n/q) r^4.5 log(1/ε); time C(n/q) r^6.5 log n log^2(1/ε).
LRMC (best) [33]: No; left & right incoherence, X* has rank r; m ≥ C(n/q) r^2 log^2 n log^2(1/ε); time C(n/q) r^3 log n log^2(1/ε).
Sparse PR (first) [6]: Yes; x* is s-sparse in canonical basis, min nonzero entry lower bounded; m ≥ C s^2 log n log(1/ε); time C n s^3 log n log^2(1/ε).
Sparse PR (best) [17], [18]: Yes; x* is s-sparse in canonical basis; m ≥ C s^2 log n; time C n s^2 log n log(1/ε).…”
Section: Problem
confidence: 99%
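
For a rough sense of scale of the bounds quoted in this table, the arithmetic below plugs arbitrary example values into the leading terms; the constant C and the accuracy factor log(1/ε) are dropped, so this is purely illustrative and not taken from either cited line of work.

```python
import math

# Purely illustrative arithmetic: plug example values into the leading terms of
# the quoted sample-complexity bounds. The constant C and the accuracy factor
# log(1/eps) are dropped, and n, q, r, s are arbitrary choices.
n, q = 100_000, 500   # example ambient dimension and number of columns
r, s = 5, 20          # example rank and sparsity level

sparse_pr_best = s**2 * math.log(n)              # ~ s^2 log n         (Sparse PR, best)
lrmc_best = (n / q) * r**2 * math.log(n)**2      # ~ (n/q) r^2 log^2 n (LRMC, best)

print(f"s^2 log n          ~ {sparse_pr_best:,.0f}  (vs. n = {n:,})")
print(f"(n/q) r^2 log^2 n  ~ {lrmc_best:,.0f}")
```
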
“…• Refinement: We then sequentially refine this estimate using a strategy based on alternating minimization, following a variant of [10,14,15]. We prove that our refinement strategy demonstrates linear convergence to x*, i.e., for t = 0, 1, 2, …”
Section: Algorithm
confidence: 95%
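
The refinement strategy quoted above is an alternating-minimization scheme. The sketch below shows a generic real-valued version of such an iteration, alternating between estimating the missing signs and solving the resulting least-squares problem; the function name, iteration count, and solver choice are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Hedged sketch of a generic alternating-minimization refinement step for
# real-valued phase retrieval, in the spirit of the strategy quoted above.
# The function name, iteration count, and least-squares solver are
# illustrative choices, not the authors' code.
def altmin_refine(A, y, x0, n_iters=50):
    """Refine an initial estimate x0 of x*, given y = |A @ x*|."""
    x = x0.copy()
    for _ in range(n_iters):
        # Step 1: estimate the missing signs using the current iterate.
        signs = np.sign(A @ x)
        # Step 2: with the signs fixed, the problem becomes ordinary least
        #         squares: minimize || signs * y - A x ||_2 over x.
        x, *_ = np.linalg.lstsq(A, signs * y, rcond=None)
    return x
```
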
“…A proof will be provided in an extended version of this paper. The proof follows an adaptation of the approach in [14,15], which itself is based on the approach of [17]. Lemma 4.2.…”
Section: Proof Sketch
confidence: 99%
“…Usually, s is small compared to n in the sparse phase retrieval problem, which makes it possible for (2) to require far fewer measurements than n for successful recovery. Indeed, practical algorithms such as the ℓ1-regularized PhaseLift method [25], sparse AltMin [31], thresholding/projected Wirtinger flow and its variants [9,33], SPARTA [43], CoPRAM [24], and HTP [7], to name a few, can successfully recover the sparse signal from (2) with high probability when m ∼ O(s^2 log n) Gaussian random sensing vectors are used. Most practical sparse phase retrieval algorithms are extensions of the corresponding approaches for the general phase retrieval problem (1) to the sparse setting (2).…”
Section: Introduction
confidence: 99%
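
To make the flavor of these sparse phase retrieval iterations concrete, the sketch below shows one simplified, real-valued hard-thresholded gradient step in the spirit of the thresholding/projected Wirtinger flow family mentioned above; the amplitude loss, fixed step size, and assumption of a known sparsity level s are illustrative choices rather than the specification of any single cited algorithm.

```python
import numpy as np

# Hedged, simplified real-valued sketch of one hard-thresholded gradient step,
# in the spirit of the thresholding/projected Wirtinger-flow family cited above.
# The amplitude loss, fixed step size, and known sparsity level s are
# illustrative assumptions, not the specification of any one cited algorithm.
def hard_threshold(v, s):
    """Keep the s largest-magnitude entries of v and zero out the rest."""
    out = np.zeros_like(v)
    keep = np.argsort(np.abs(v))[-s:]
    out[keep] = v[keep]
    return out

def thresholded_gradient_step(A, y, x, s, step=0.1):
    """One descent step on the amplitude loss (1/2m) * sum_i (|a_i^T x| - y_i)^2."""
    z = A @ x
    grad = A.T @ ((np.abs(z) - y) * np.sign(z)) / len(y)
    return hard_threshold(x - step * grad, s)
```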