2010
DOI: 10.1007/s11228-010-0147-7
Dualization of Signal Recovery Problems

Abstract: In convex optimization, duality theory can sometimes lead to simpler solution methods than those resulting from direct primal analysis. In this paper, this principle is applied to a class of composite variational problems arising in particular in signal recovery. These problems are not easily amenable to solution by current methods but they feature Fenchel-Moreau-Rockafellar dual problems that can be solved by forward-backward splitting. The proposed algorithm produces simultaneously a sequence converging weak…
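To make the abstract's description concrete, below is a minimal sketch of a dual forward-backward scheme of the kind it describes, assuming the illustrative model problem minimize_x ½‖x − r‖² + λ‖Lx‖₁; this specific problem, the function name `dual_forward_backward`, and the finite-difference choice of `L` are assumptions for the example, not details taken from the paper. The dual objective is a smooth term plus the conjugate of λ‖·‖₁, whose proximity operator is a projection onto an ℓ∞ ball, and a primal iterate is recovered from the dual one at the end.

```python
import numpy as np

# Hedged sketch: dual forward-backward splitting for the assumed model problem
#   minimize_x  0.5 * ||x - r||^2 + lam * ||L x||_1 .
# Its Fenchel dual is  minimize_v  0.5 * ||L^T v||^2 - <L^T v, r> + g*(v)
# with g = lam * ||.||_1, so g* is the indicator of the l_inf ball of radius lam
# and prox_{gamma g*} is the projection onto that ball.

def dual_forward_backward(L, r, lam, n_iter=500):
    m, _ = L.shape
    v = np.zeros(m)                               # dual variable
    gamma = 1.0 / np.linalg.norm(L, 2) ** 2       # step size < 2 / ||L||^2
    for _ in range(n_iter):
        grad = L @ (L.T @ v - r)                  # gradient of the smooth dual term
        v = np.clip(v - gamma * grad, -lam, lam)  # prox of g* = projection onto the ball
    x = r - L.T @ v                               # primal iterate recovered from the dual one
    return x, v

# Usage example: denoise a noisy ramp with a first-order difference operator L.
rng = np.random.default_rng(0)
r = np.linspace(0.0, 1.0, 50) + 0.05 * rng.standard_normal(50)
L = np.diff(np.eye(50), axis=0)                   # (L x)_i = x_{i+1} - x_i
x_hat, v_hat = dual_forward_backward(L, r, lam=0.1)
```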

Cited by 83 publications (72 citation statements). References 65 publications.
“…In the following we provide an analysis of the notion of inexactness given in Definition 2.1, which will clarify the nature of these approximations and the scope of applicability. To this purpose, we will make use of the duality technique, an approach that is quite common in signal recovery and image processing applications [18,12,16]. The starting point is the Moreau decomposition formula [41,18] $\operatorname{prox}_{\lambda g}(y) = y - \lambda\,\operatorname{prox}_{g^*/\lambda}(y/\lambda)$, (2.5) where $g^* : H \to \mathbb{R}$ is the conjugate functional of $g$ defined as $g^*(y) = \sup_{x \in H}\,(\langle x, y\rangle - g(x))$.…”
Section: Main Contributions
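A quick numerical check of the Moreau decomposition (2.5) quoted above, under the assumed choice g = ‖·‖₁, so that g* is the indicator of the ℓ∞ unit ball and prox_{g*/λ} reduces to a projection; the helper names are illustrative only.

```python
import numpy as np

# Illustration of the Moreau decomposition (2.5) for the assumed case g = ||.||_1:
# g* is the indicator of the l_inf unit ball, so prox_{g*/lam} is the projection
# onto that ball, and (2.5) should reproduce componentwise soft-thresholding.

def prox_l1_via_moreau(y, lam):
    proj = np.clip(y / lam, -1.0, 1.0)   # prox_{g*/lam}(y/lam) = projection onto the ball
    return y - lam * proj                # prox_{lam g}(y) = y - lam * prox_{g*/lam}(y/lam)

def soft_threshold(y, lam):
    return np.sign(y) * np.maximum(np.abs(y) - lam, 0.0)

y = np.array([-2.0, -0.3, 0.0, 0.7, 3.0])
assert np.allclose(prox_l1_via_moreau(y, 0.5), soft_threshold(y, 0.5))
```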
“…convex function. The structure (2.7) often arises in regularization for ill-posed inverse problems [13,10,28,53,67,16]. By definition, finding $\operatorname{prox}_{\lambda g}(y)$ requires the solution of the minimization problem…”
Section: Main Contributions
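The quoted sentence is cut off before stating the minimization problem itself; for reference, the standard Moreau definition of the proximity operator, which is presumably what is meant, reads

```latex
% Standard Moreau definition of the proximity operator, stated for reference;
% this is the minimization problem the (truncated) quote alludes to.
\[
  \operatorname{prox}_{\lambda g}(y)
    \;=\; \operatorname*{arg\,min}_{x \in H}\;
          \Bigl( \lambda\, g(x) \;+\; \tfrac{1}{2}\,\lVert x - y \rVert^{2} \Bigr).
\]
```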
“…Note that, if A is the identity matrix, one recovers the usual proximity operator $\operatorname{prox}_f : \mathbb{R}^N \to \mathbb{R}^N$, which is at the core of numerous convex optimization algorithms (see [33,34,35] for tutorials and use for multicomponent image processing). We are now ready to provide Algorithm 1 for the minimization of function F:…”
Section: Minimization Strategy
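As a generic illustration of how prox_f enters such algorithms (this is not the citing paper's Algorithm 1), here is a minimal forward-backward sketch for F = h + f with the assumed choices h(x) = ½‖Mx − b‖² and f = λ‖·‖₁, so the backward step uses the usual proximity operator (A equal to the identity), which here is componentwise soft-thresholding.

```python
import numpy as np

# Generic forward-backward (proximal-gradient) sketch, not the citing paper's
# Algorithm 1: x_{k+1} = prox_{gamma f}(x_k - gamma * grad h(x_k)) for F = h + f,
# with the assumed choices h(x) = 0.5*||Mx - b||^2 and f(x) = lam*||x||_1.

def soft_threshold(x, t):
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def forward_backward(M, b, lam, n_iter=300):
    x = np.zeros(M.shape[1])
    gamma = 1.0 / np.linalg.norm(M, 2) ** 2               # < 2 / Lipschitz const of grad h
    for _ in range(n_iter):
        grad = M.T @ (M @ x - b)                          # forward (gradient) step on h
        x = soft_threshold(x - gamma * grad, gamma * lam) # backward (prox) step on f
    return x

# Usage example with a random sensing matrix and a sparse target.
rng = np.random.default_rng(1)
M = rng.standard_normal((30, 80))
b = M @ (np.arange(80) < 5).astype(float) + 0.01 * rng.standard_normal(30)
x_hat = forward_backward(M, b, lam=0.1)
```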
“…Note that the dual forward-backward algorithm proposed in [54] is not directly applicable to Problem (2.8) since the projection onto C is not explicit in general. Dykstra's algorithm [55] to compute the projection onto an intersection of convex sets would not be applicable either, as the matrix DΓD is usually singular.…”
Section: Convex Analysis Tools
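For context on the scheme the quote rules out, here is a minimal sketch of Dykstra's algorithm for projecting onto the intersection of two convex sets, under the assumption that the individual projections are available in closed form; the example sets and function names are illustrative and unrelated to the citing paper's constraint set C.

```python
import numpy as np

# Minimal sketch of Dykstra's algorithm for two convex sets C1 and C2, assuming
# closed-form projections onto each set; shown for context only, since the quote
# explains why this scheme is not applicable to the problem considered there.

def dykstra(y, proj1, proj2, n_iter=200):
    x = y.copy()
    p = np.zeros_like(y)          # correction term associated with C1
    q = np.zeros_like(y)          # correction term associated with C2
    for _ in range(n_iter):
        w = proj1(x + p)
        p = x + p - w
        x = proj2(w + q)
        q = w + q - x
    return x

# Example: project onto the intersection of the unit box and {x : sum(x) <= 1}.
proj_box = lambda z: np.clip(z, 0.0, 1.0)

def proj_halfspace(z, b=1.0):
    a = np.ones_like(z)
    excess = a @ z - b
    return z if excess <= 0 else z - (excess / (a @ a)) * a

x_star = dykstra(np.array([0.9, 0.8, 0.7]), proj_box, proj_halfspace)
```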