2020
DOI: 10.1137/19m1237569

Geometry and Symmetry in Short-and-Sparse Deconvolution

Abstract: We study the Short-and-Sparse (SaS) deconvolution problem of recovering a short signal a0 and a sparse signal x0 from their convolution. We propose a method based on nonconvex optimization, which under certain conditions recovers the target short and sparse signals, up to a signed shift symmetry which is intrinsic to this model. This symmetry plays a central role in shaping the optimization landscape for deconvolution. We give a regional analysis, which characterizes this landscape geometrically, on a union of…
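
To make the model in the abstract concrete, the following sketch generates a short kernel a0, a sparse signal x0, and their circular convolution y, and numerically checks the signed-shift symmetry the abstract refers to. The sizes, sparsity level, and the use of circular convolution are illustrative assumptions, not values taken from the paper.

```python
# Minimal sketch of the short-and-sparse model y = a0 (*) x0 and of the
# signed-shift symmetry mentioned in the abstract. Sizes, sparsity level, and
# circular convolution are illustrative assumptions, not values from the paper.
import numpy as np

rng = np.random.default_rng(0)
n, p, k = 256, 16, 8                        # signal length, kernel length, number of spikes

a0 = rng.standard_normal(p)
a0 /= np.linalg.norm(a0)                    # kernel scale is not identifiable, so normalize
x0 = np.zeros(n)
x0[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)   # sparse signal

def cconv(u, v):
    """Circular convolution computed via the FFT."""
    return np.real(np.fft.ifft(np.fft.fft(u) * np.fft.fft(v)))

a0_pad = np.pad(a0, (0, n - p))             # embed the short kernel in R^n
y = cconv(a0_pad, x0)                       # observation

# Signed-shift symmetry: shifting the kernel, counter-shifting the sparse
# signal, and flipping both signs reproduces exactly the same observation,
# so (a0, x0) can only be recovered up to this symmetry.
s = 3
print(np.allclose(y, cconv(-np.roll(a0_pad, s), -np.roll(x0, -s))))  # True
```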

Cited by 25 publications (46 citation statements). References 46 publications.

“…On the other hand, we believe the probability tools of decoupling and measure concentration we developed here can form a solid foundation for studying other nonconvex problems under the random convolutional model. Those problems include blind calibration [80][81][82], sparse blind deconvolution [76,[83][84][85][86][87][88][89][90][91][92][93], and convolutional dictionary learning [16,[94][95][96][97]].…”
Section: Geometric Analysis and Global Result
confidence: 99%
“…Other algorithms for multi-channel blind deconvolution include sparse spectral methods [21] and nonconvex regularization [22]. A different model called "sparse-and-short" deconvolution is studied in [6,23]. The work of Li and Bresler [4] is most closely related to ours; it considered the same problem by running perturbed manifold gradient descent with a random initialization over a spherically constrained loss function based on ℓ4-norm maximization.…”
Section: Related Work
confidence: 99%
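
The spherically constrained ℓ4 formulation mentioned in the statement above can be illustrated with a generic projected gradient ascent that maximizes f(q) = ‖Zq‖₄⁴ over the unit sphere. The sketch below is only an illustration under that assumption; it is not the perturbed manifold gradient descent of Li and Bresler [4], and Z is a placeholder for whatever (preconditioned) data matrix a concrete method would build from the observations.

```python
# Illustrative sketch only: maximize the l4 objective f(q) = ||Z q||_4^4 over
# the unit sphere by projected gradient ascent. This is not the algorithm of
# [4]; Z is a placeholder data matrix.
import numpy as np

def l4_maximize_on_sphere(Z, step=0.05, iters=500, seed=0):
    rng = np.random.default_rng(seed)
    m, n = Z.shape
    q = rng.standard_normal(n)
    q /= np.linalg.norm(q)                  # random initialization on the sphere
    for _ in range(iters):
        r = Z @ q
        grad = 4.0 * Z.T @ (r ** 3) / m     # Euclidean gradient of f(q), averaged over rows
        q = q + step * grad                 # ascent step (step size is an arbitrary choice here)
        q /= np.linalg.norm(q)              # project back onto the sphere
    return q

# Toy run on random data, just to show the iteration is well defined; in a real
# problem the rows of Z would encode the convolutional measurements.
q_hat = l4_maximize_on_sphere(np.random.default_rng(1).standard_normal((200, 32)))
print(np.linalg.norm(q_hat))                # 1.0 up to numerical precision
```
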
“…When the filter is unknown, this leads to the so-called blind deconvolution problem. This problem is ill-posed without extra assumptions on the filter, since the number of unknowns is much larger than the number of observations [4,5,6,7,8]. Luckily, in many situations, one can make multiple observations sharing the same filter but with diverse sparse inputs, either spatially or temporally, thanks to advances in sensing technology.…”
Section: Introduction
confidence: 99%
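
The multichannel setting described in the statement above, several observations that share one filter but have independent sparse inputs, can be written as y_i = a ∗ x_i. A minimal sketch of this data model, with illustrative sizes and circular convolution assumed, is:

```python
# Minimal sketch of the multichannel data model: N observations share the same
# filter a but have independent sparse inputs x_i. Sizes, sparsity level, and
# circular convolution are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n, N, k = 128, 20, 5                        # signal length, channels, spikes per channel

a = rng.standard_normal(n)                  # shared (unknown) filter
X = np.zeros((N, n))
for i in range(N):
    X[i, rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)

# y_i = a (*) x_i for every channel i, computed via the FFT.
Y = np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(X, axis=1), axis=1))
print(Y.shape)                              # (20, 128): one observation per channel
```
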
“…They assume that there is a single measurement of the form y = a ∗ x, that x is sparse, and that the length of the kernel a is much shorter than that of y and x. In particular, Zhang et al. [ZKW18] formulated the problem as an ℓ4-maximization problem over the sphere, similar to [LB18], proving that on a local region every local minimizer is near a truncated signed shift of a. Kuo et al. [KLZW19] studied a dropped-quadratic simplification of the bilinear lasso objective [LQK+19], which provably obtains exact recovery for an incoherent kernel a and sparse x. However, since the kernel and the measurement do not have the same length in SaS, SaS deconvolution is much harder than MCS-BD: the problem has spurious local minimizers such as shift-truncations, so most of the results there can only establish a benign local geometric structure, regardless of the choice of objective.…”
Section: Short-and-Sparse Deconvolution
confidence: 99%
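
For context, the bilinear lasso objective referred to in the statement above combines a convolutional data-fidelity term with an ℓ1 penalty on the sparse signal, ψ(a, x) = ½‖y − a ∗ x‖² + λ‖x‖₁; the dropped-quadratic objective of Kuo et al. [KLZW19] is a simplification of it. The snippet below only evaluates such an objective, assuming circular convolution and an arbitrary penalty weight; it is a sketch of the loss, not of the analysis or algorithms in the cited papers.

```python
# Sketch of a bilinear-lasso-style objective for SaS deconvolution,
#   psi(a, x) = 0.5 * ||y - a (*) x||_2^2 + lam * ||x||_1,
# with circular convolution and an illustrative penalty weight lam.
import numpy as np

def cconv(u, v):
    """Circular convolution computed via the FFT."""
    return np.real(np.fft.ifft(np.fft.fft(u) * np.fft.fft(v)))

def bilinear_lasso(a, x, y, lam=0.1):
    residual = y - cconv(a, x)
    return 0.5 * residual @ residual + lam * np.sum(np.abs(x))

# At the ground-truth pair the fidelity term vanishes and only the penalty
# remains; a mismatched kernel gives a strictly larger value.
rng = np.random.default_rng(0)
n = 64
a = rng.standard_normal(n); a /= np.linalg.norm(a)
x = np.zeros(n); x[rng.choice(n, size=4, replace=False)] = rng.standard_normal(4)
y = cconv(a, x)
print(bilinear_lasso(a, x, y))                                    # lam * ||x||_1
print(bilinear_lasso(rng.standard_normal(n) / np.sqrt(n), x, y))  # larger
```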