2021
DOI: 10.1088/1361-6420/abd29c
A fast homotopy algorithm for gridless sparse recovery

Abstract: In this paper, we study the gridless sparse optimization problem and its application to 3D image deconvolution. Building on the recent work of [14], which introduced the Sliding Frank-Wolfe (SFW) algorithm to solve the Beurling LASSO (BLASSO) problem, we introduce an accelerated algorithm, denoted BSFW, that preserves its convergence properties while removing most of the costly local descents. Moreover, since solving the BLASSO still relies on a regularization parameter, we introduce a homotopy algorithm to solve the …

Cited by 11 publications (7 citation statements) · References 33 publications
“…The study of optimization problems over the space of Radon measures can be traced back to the pioneering works of Beurling [22], where Fourier-domain measurements were also considered. In the early 2010s, the works of De Castro and Gamboa [2], Candès and Fernandez-Granda [3,23], and Bredies and Pikkarainen [4] considered optimization tasks of the form (3) (or its penalized version), with both theoretical analyses and novel algorithmic approaches to recover a sparse-measure solution, in the continuum [4,7,8,10,24–27]. The existence of sparse-measure solutions, i.e., solutions of the form ∑_{k=1}^K a_k δ_{x_k}, where K ∈ ℕ*, a_k ∈ ℝ, and δ_{x_k} is the Dirac mass at the location x_k, seems to have been proven for the first time in [22] and was later improved by Fisher and Jerome in [28].…”
Section: Related Work
confidence: 99%
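The excerpt refers to "optimization tasks of the form (3)" without reproducing equation (3); the penalized version mentioned here is, in the standard notation for the BLASSO (stated from general knowledge, not quoted from the paper):

```latex
\min_{\mu \in \mathcal{M}(\mathcal{X})}\;
\frac{1}{2}\,\bigl\| \Phi\mu - y \bigr\|_2^2
\;+\; \lambda\, |\mu|(\mathcal{X}),
\qquad
\text{with sparse solutions of the form }
\mu^\star = \sum_{k=1}^{K} a_k\, \delta_{x_k},
```

where M(X) is the space of Radon measures on the domain X, Φ is the measurement operator, |μ|(X) is the total-variation norm of the measure, and λ > 0 is the regularization parameter discussed below.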
“…They provide a general framework for the recovery of sparse continuous-domain signals (e.g., Dirac streams or splines) from possibly corrupted finite-dimensional measurements. Such techniques rely on solid theoretical foundations [2–6], but also on many algorithmic advances [4,7,8], and have found various data-science applications [7,9–11]. It is well known that their discrete-domain counterparts (i.e., ℓ1 regularization methods) lead to variational problems whose solutions are not necessarily unique [12].…”
Section: Introduction
confidence: 99%
“…Setting the regularization parameter is known to be a difficult problem. To ease this tuning, the homotopy approach can be used [33]: results of the algorithm for several parameters λ can be obtained by sequentially running the SFW algorithm for decreasing λ, initializing each run with the output of the previous run. In this case, the algorithm starts by solving (15) and checking the value of η, to avoid adding a source if it is not necessary.…”
Section: B. Setting the Regularization Parameter λ
confidence: 99%
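The warm-started homotopy loop described in this citation can be sketched in a few lines. The sketch below is a minimal stand-in, not the paper's method: it replaces the continuous SFW solve with a discretized ℓ1 solver (plain ISTA on a fixed grid), and the names `ista_lasso` and `homotopy_path` are hypothetical. What it illustrates is only the outer homotopy structure: run the solver for a decreasing sequence of λ, initializing each run at the previous solution.

```python
import numpy as np

def ista_lasso(A, y, lam, x0, n_iter=1000):
    """Discretized stand-in for one SFW solve:
    ISTA on min_x 0.5*||Ax - y||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of the gradient
    x = x0.copy()
    for _ in range(n_iter):
        g = A.T @ (A @ x - y)                # gradient of the data-fidelity term
        x = x - g / L                        # gradient step
        x = np.sign(x) * np.maximum(np.abs(x) - lam / L, 0.0)  # soft-threshold
    return x

def homotopy_path(A, y, lambdas):
    """Solve for a decreasing sequence of lambdas,
    warm-starting each run with the previous output."""
    x = np.zeros(A.shape[1])
    path = []
    for lam in sorted(lambdas, reverse=True):  # largest lambda first
        x = ista_lasso(A, y, lam, x0=x)        # warm start from previous run
        path.append((lam, x.copy()))
    return path

# toy problem: 2-sparse signal, Gaussian measurements
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 50))
x_true = np.zeros(50)
x_true[[5, 30]] = [1.0, -2.0]
y = A @ x_true
path = homotopy_path(A, y, [1.0, 0.1, 0.01])
```

As λ decreases along the path, the data fit tightens; each warm start makes the next run much cheaper than solving from scratch, which is the point of the homotopy approach.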
“…The basic scheme of conditional gradient methods [3,5,16,17,18,29] for (1.1) is to add a single Dirac measure (or spike) to the discrete measure μ, then optimise the weights of the spikes inserted so far, and repeat.…”
Section: Introduction
confidence: 99%
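The add-a-spike-then-reoptimise scheme in this citation can be sketched as follows. This is a toy illustration under stated assumptions, not any cited paper's algorithm: `cg_sparse_recovery` and `phi` are hypothetical names, a grid search stands in for the continuous arg-max over spike locations, and a least-squares refit stands in for the weight-optimisation step.

```python
import numpy as np

def cg_sparse_recovery(phi, y, grid, n_spikes):
    """Conditional-gradient-style sketch: greedily add one spike,
    then re-optimise the weights of all spikes inserted so far.

    phi(x) returns the measurement column for a spike at location x;
    a grid search stands in for the continuous arg-max over locations."""
    locations, weights = [], np.zeros(0)
    for _ in range(n_spikes):
        model = sum(w * phi(x) for w, x in zip(weights, locations))
        residual = y - model if locations else y
        # pick the location whose column best correlates with the residual
        scores = [abs(phi(x) @ residual) for x in grid]
        locations.append(grid[int(np.argmax(scores))])
        # refit all weights jointly (least squares stands in for the
        # corrective weight-optimisation step of the scheme)
        Phi = np.column_stack([phi(x) for x in locations])
        weights, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return locations, weights

# toy 1D deconvolution: Gaussian point-spread function sampled at 40 points
t = np.linspace(0.0, 1.0, 40)
phi = lambda x: np.exp(-((t - x) ** 2) / (2 * 0.05 ** 2))
true_locs, true_amps = [0.2, 0.7], [1.0, -0.5]
y = sum(a * phi(x) for a, x in zip(true_amps, true_locs))
grid = np.linspace(0.0, 1.0, 201)
locs, amps = cg_sparse_recovery(phi, y, grid, n_spikes=2)
```

Each iteration grows the support by one spike and then corrects all the amplitudes, which is the "add a single Dirac measure, then optimise the weights, repeat" loop the citation describes.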