2020
DOI: 10.1137/19m1304854
Operator Splitting Performance Estimation: Tight Contraction Factors and Optimal Parameter Selection

Cited by 43 publications (37 citation statements)
References 52 publications
“…The authors of [35] introduced tightness guarantees for smooth (strongly) convex optimization, extended to larger classes of problems in [36] (which provides a list of sufficient conditions for applying the methodology). It has also been used for nonsmooth problems [11,36], monotone inclusions and variational inequalities [16,17,21,31], and even to study fixed-point iterations of non-expansive operators [25]. Fixed-step gradient descent was among the first algorithms studied with this methodology, in several settings: (possibly composite) smooth (possibly strongly) convex optimization [14,15,35,36], with its line-search variant analyzed by the same methodology in [22].…”
“…These illustrations do more than provide intuition. It is a straightforward consequence of, e.g., [24, 25] that for compositions of two operator classes that admit I-N decompositions, there always exists a 2D worst case. Hence, if the 2D illustration implies that the composition class admits a specific -I-N decomposition, so does the full operator class.…”
Section: Graphical Characterizations
“…Operator regression is at once a recent and a long-standing topic. We build on the recent work [41] and on F. A. Valentine's 1945 paper [42].…”
Section: Related Work