2018
DOI: 10.1007/s10957-018-1328-z
Envelope Functions: Unifications and Further Properties

Abstract: Forward-backward and Douglas-Rachford splitting are methods for structured nonsmooth optimization. With the aim of using smooth optimization techniques for nonsmooth problems, the forward-backward and Douglas-Rachford envelopes were recently proposed. Under specific problem assumptions, these envelope functions have favorable smoothness and convexity properties, and their stationary points coincide with the fixed points of the underlying algorithm operators. This allows for solving such nonsmooth optimization pr…
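For context, a minimal sketch of the forward-backward envelope in its usual form (the notation below is standard in this literature and is an assumption here, not taken from the abstract): for $\varphi = f + g$ with $f$ differentiable with $L_f$-Lipschitz gradient, $g$ proper and closed, and a step size $\gamma \in (0, 1/L_f)$,

$$\varphi_\gamma^{\mathrm{FB}}(x) \;=\; \min_{z} \Big\{ f(x) + \langle \nabla f(x),\, z - x \rangle + g(z) + \tfrac{1}{2\gamma} \|z - x\|^2 \Big\},$$

whose minimizing $z$ is exactly the forward-backward step $T_\gamma(x) = \mathrm{prox}_{\gamma g}(x - \gamma \nabla f(x))$. The envelope is real-valued, agrees with $\varphi$ at fixed points of $T_\gamma$, and, under assumptions of the kind recalled in the abstract, its stationary points coincide with those fixed points, which is what permits smooth (e.g. quasi-Newton) machinery on a nonsmooth problem.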

Cited by 7 publications (5 citation statements) · References 32 publications
“…Lemma 6. In problem (10), suppose that f is L_f-smooth and g is proper, convex, and lsc. Then, for every…”
Section: A Connections With the Forward-Backward Envelope
Mentioning confidence: 99%
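For readers outside the splitting literature, the assumptions in the quoted lemma have their standard meanings (these definitions are standard background, not quoted from the citing paper): f is L_f-smooth if it is differentiable with $\|\nabla f(x) - \nabla f(y)\| \le L_f \|x - y\|$ for all x, y; g is proper if it never takes the value $-\infty$ and is finite somewhere; and g is lsc (lower semicontinuous) if $\liminf_{y \to x} g(y) \ge g(x)$ at every x. Together with convexity of g, these conditions make the proximal mapping $\mathrm{prox}_{\gamma g}(v) = \arg\min_z \{ g(z) + \tfrac{1}{2\gamma}\|z - v\|^2 \}$ well defined and single-valued, which is what the forward-backward iteration and its envelope rely on.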
“…Most related to our approach, [4] analyzes a Gauss-Seidel-type FBS in the spirit of the PALM algorithm [7], and [16] exploits the interpretation of FBS as a gradient-type algorithm on the forward-backward envelope (FBE) [17,22] to develop quasi-Newton methods for the nonsmooth and nonconvex problem (2). The gradient interpretation of splitting schemes originated in [20] with the proximal point algorithm and has recently been extended to several other schemes [10,17,18,23]. In this work we undertake the converse direction: first we design a smooth surrogate of the nonsmooth DC function in (P), and then derive a novel splitting algorithm from its gradient steps.…”
Section: Introduction
Mentioning confidence: 99%
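As a concrete illustration of the gradient view mentioned in the quote above, here is a minimal Python sketch; the problem data, step size, and function names are assumptions chosen for illustration, not taken from the cited papers. It runs forward-backward splitting on f(x) = 0.5*||Ax - b||^2 plus g(x) = lam*||x||_1 and evaluates the forward-backward envelope at the iterates.

import numpy as np

def prox_l1(v, t):
    # proximal operator of t * ||.||_1 (soft-thresholding)
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def fbe(x, A, b, lam, gamma):
    # forward-backward envelope of f + g at x, with f(x) = 0.5*||Ax - b||^2 and g = lam*||.||_1:
    # min_z { f(x) + <grad f(x), z - x> + g(z) + ||z - x||^2 / (2*gamma) },
    # evaluated at its minimizer z = prox_{gamma*g}(x - gamma*grad f(x)).
    r = A @ x - b
    grad = A.T @ r
    z = prox_l1(x - gamma * grad, gamma * lam)      # forward-backward step T_gamma(x)
    d = z - x
    return 0.5 * r @ r + grad @ d + lam * np.abs(z).sum() + d @ d / (2.0 * gamma)

rng = np.random.default_rng(0)                      # hypothetical random problem instance
A, b, lam = rng.standard_normal((20, 50)), rng.standard_normal(20), 0.1
gamma = 0.9 / np.linalg.norm(A, 2) ** 2             # step size below 1/L_f, with L_f = ||A||_2^2
x = np.zeros(A.shape[1])
for k in range(200):
    # forward-backward splitting: gradient step on f, proximal step on g
    x = prox_l1(x - gamma * (A.T @ (A @ x - b)), gamma * lam)
    if k % 50 == 0:
        print(k, fbe(x, A, b, lam, gamma))          # the FBE decreases along the iterates

The quasi-Newton methods referenced in the quote replace the plain forward-backward update with a curvature-informed step computed on this same smooth surrogate, which is possible precisely because the FBE shares its stationary points with the fixed points of the splitting map.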
“…$\|x^{k+1|k} - x\| \le \zeta_P \|x^k - x^{k+1|k}\|$ (22) and $\|x^{k+1} - x_{k+1}\| \le \zeta_C \|x^{k+1|k} - x^{k+1}\|$ (23), and therefore the last thing to do is to combine them to derive a bound for the error $x^{k+1} - x_{k+1}$. Following the same steps of [14, Appendix B], from inequalities (21), (22) and the results above it is possible to compute…”
Section: A3 Overall Error Bound
Mentioning confidence: 99%
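Schematically, and with generic error symbols rather than the quoted paper's exact quantities, two bounds of this prediction/correction form chain by simple substitution: if $e_{k+1|k} \le \zeta_P\, e_k$ (prediction) and $e_{k+1} \le \zeta_C\, e_{k+1|k}$ (correction), then $e_{k+1} \le \zeta_C \zeta_P\, e_k$, hence $e_k \le (\zeta_C \zeta_P)^k e_0$, so the overall error contracts whenever $\zeta_C \zeta_P < 1$.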
“…Remark 4: The recent work [22] proved that if ϕ is convex quadratic, then the FBE is strongly convex and smooth; notice that this is exactly the case of h_k in the prediction step. Therefore we can minimize the FBE at the prediction step using a Newton-type method with a BFGS scheme, without the need for a line search, which would require a larger number of iterations.…”
Section: Forward-Backward Envelope
Mentioning confidence: 99%
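A standard identity underlying such quasi-Newton schemes (stated in generic FBE notation, which is an assumption here rather than the quoted paper's own): when the smooth part $f$ is twice differentiable and $g$ is proper, closed, and convex, the FBE is continuously differentiable with

$$\nabla \varphi_\gamma^{\mathrm{FB}}(x) \;=\; \big(I - \gamma \nabla^2 f(x)\big)\, R_\gamma(x), \qquad R_\gamma(x) \;=\; \tfrac{1}{\gamma}\big(x - \mathrm{prox}_{\gamma g}(x - \gamma \nabla f(x))\big).$$

For a convex quadratic $f$ with Hessian $Q$ and $\gamma \in (0, 1/\lambda_{\max}(Q))$, the matrix $I - \gamma Q$ is positive definite, so the FBE gradient vanishes exactly at fixed points of the forward-backward map, and a BFGS-type method on the FBE needs only gradients of $f$ and proximal steps on $g$.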