2019
DOI: 10.1007/s10957-019-01477-z

An Envelope for Davis–Yin Splitting and Strict Saddle-Point Avoidance

Abstract: It is known that operator splitting methods based on Forward-Backward Splitting (FBS), Douglas-Rachford Splitting (DRS), and Davis-Yin Splitting (DYS) decompose a difficult optimization problem into simpler subproblems under proper convexity and smoothness assumptions. In this paper, we identify an envelope (an objective function) whose gradient descent iteration under a variable metric coincides with the DYS iteration. This result generalizes the Moreau envelope for proximal-point iteration and the envelopes for …
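The DYS iteration referenced in the abstract takes one proximal step on each of two possibly nonsmooth terms and one gradient step on the smooth term. Below is a minimal sketch of the standard DYS update applied to a toy problem; the problem data (A, b, the l1 weight lam, the box constraint) and the step size choice are illustrative assumptions, not taken from the paper.

import numpy as np

# Minimal sketch of the Davis-Yin splitting (DYS) iteration for
#   minimize f(x) + g(x) + h(x),  with h smooth.
# The problem data below (A, b, the l1 weight lam, the box constraint)
# are illustrative assumptions, not taken from the paper.

np.random.seed(0)
A = np.random.randn(20, 10)
b = np.random.randn(20)
lam = 0.1
L = np.linalg.norm(A.T @ A, 2)   # Lipschitz constant of grad h
gamma = 1.0 / L                  # conservative step size

def grad_h(x):
    # h(x) = 0.5 * ||Ax - b||^2  (smooth term)
    return A.T @ (A @ x - b)

def prox_g(x, t):
    # g(x) = lam * ||x||_1  -> soft thresholding
    return np.sign(x) * np.maximum(np.abs(x) - t * lam, 0.0)

def prox_f(x, t):
    # f = indicator of the box [-1, 1]^10  -> projection
    return np.clip(x, -1.0, 1.0)

z = np.zeros(10)
for _ in range(500):
    xg = prox_g(z, gamma)                                  # backward step on g
    xf = prox_f(2 * xg - z - gamma * grad_h(xg), gamma)    # forward-backward step on f and h
    z = z + (xf - xg)                                      # update of the governing sequence

print("approximate minimizer:", xf)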

Cited by 15 publications (10 citation statements) · References 25 publications
“…In future work we plan to address the following issues: (i) reducing the working assumptions by also accounting for boundary points, (ii) extending existing superlinear direction schemes such as those proposed in [48,85,3,81] for either convex or smooth problems to the more general setting of this paper, (iii) assessing the performance of such schemes in the Bella framework with numerical simulations on nonconvex nonsmooth problems such as low-rank matrix completion, sparse nonnegative matrix factorization, phase retrieval, and deep learning, and (iv) guaranteeing saddle point avoidance, in the spirit of [67,46,52].…”
Section: 4.1 (mentioning)
confidence: 99%
“…Existing work on TOS applied to nonconvex problems limits itself to the setting where at least two terms in (1) have Lipschitz continuous gradients. Under this assumption, Liu & Yin (2019) identify an envelope function for TOS, which permits one to interpret TOS as gradient descent for this envelope under a variable metric. Their envelope generalizes the well-known Moreau envelope as well as the envelopes for Douglas-Rachford and Forward-Backward splitting introduced in (Patrinos et al, 2014) and (Themelis et al, 2018).…”
Section: Related Work (mentioning)
confidence: 99%
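The envelope viewpoint in the statement above reduces, in the one-operator case, to the classical fact that a proximal-point step is a gradient step on the Moreau envelope. Below is a small numerical check of that special case; the choice f(x) = ||x||_1 is illustrative, not taken from the paper.

import numpy as np

# Numerical check: for the Moreau envelope
#   e_gamma_f(x) = min_u f(u) + ||u - x||^2 / (2 * gamma),
# the proximal step prox_{gamma f}(x) equals a gradient step on
# e_gamma_f with step size gamma.  Here f(x) = ||x||_1 (illustrative).

gamma = 0.5

def prox_f(x, t):
    # prox of t * ||.||_1: soft thresholding
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def envelope(x):
    # evaluate the Moreau envelope using its minimizer u = prox_f(x, gamma)
    u = prox_f(x, gamma)
    return np.sum(np.abs(u)) + np.sum((u - x) ** 2) / (2 * gamma)

def numerical_grad(x, eps=1e-6):
    # central-difference gradient of the (smooth) envelope
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = eps
        g[i] = (envelope(x + e) - envelope(x - e)) / (2 * eps)
    return g

x = np.array([1.3, -0.2, 0.05])
prox_step = prox_f(x, gamma)
gradient_step = x - gamma * numerical_grad(x)
print(np.allclose(prox_step, gradient_step, atol=1e-5))   # expected: True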
“…A recent preprint [15] approaches the problem of finding local minima using the forward-backward envelope technique developed in [19], where the assumption about the smoothness of objective function is weakened to local smoothness instead of global smoothness.…”
Section: Related Literature (mentioning)
confidence: 99%