2020
DOI: 10.48550/arxiv.2006.06041
Preprint
Principled Analyses and Design of First-Order Methods with Inexact Proximal Operators

Abstract: Proximal operations are among the most common primitives appearing in both practical and theoretical (or high-level) optimization methods. This basic operation typically consists of solving an intermediary (hopefully simpler) optimization problem. In this work, we survey notions of inaccuracy that can be used when solving those intermediary optimization problems. Then, we show that worst-case guarantees for algorithms relying on such inexact proximal operations can be systematically obtained through a generi…
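The abstract's central object, an inexact proximal operation, can be illustrated with a small sketch (ours, not from the paper): the proximal subproblem min_x f(x) + (1/2)||x − v||² is itself solved by an inner iterative method and stopped once the subproblem objective is within a tolerance ε of optimal. For a quadratic f the exact prox is available in closed form, which lets us check the inexact one; in practice the ε-criterion would be verified via a computable certificate rather than against the true minimizer.

```python
import numpy as np

def inexact_prox(A, v, eps=1e-6, max_iter=1000):
    """Approximate prox of f(x) = 0.5 * x^T A x at v, i.e. an eps-approximate
    minimizer (in function value) of phi(x) = f(x) + 0.5 * ||x - v||^2.
    Illustrative only: the stopping test uses the known exact minimizer."""
    n = len(v)
    L = np.linalg.eigvalsh(A).max()               # smoothness constant of f
    phi = lambda x: 0.5 * x @ A @ x + 0.5 * np.sum((x - v) ** 2)
    x_star = np.linalg.solve(A + np.eye(n), v)    # exact prox of a quadratic
    x = v.copy()
    for _ in range(max_iter):
        if phi(x) - phi(x_star) <= eps:           # eps-inexactness in value
            return x
        grad = A @ x + (x - v)                    # gradient of the subproblem
        x = x - grad / (L + 1.0)                  # subproblem is (L+1)-smooth
    return x

A = np.array([[2.0, 0.0], [0.0, 1.0]])
v = np.array([1.0, 1.0])
x_eps = inexact_prox(A, v)
x_exact = np.linalg.solve(A + np.eye(2), v)       # here [1/3, 1/2]
```

The gap phi(x) − phi(x⋆) is exactly the function-value notion of inaccuracy; other notions surveyed in such works (e.g. bounds on the subgradient residual) would replace the stopping test.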

Cited by 4 publications (10 citation statements)
References 85 publications (170 reference statements)
“…The performance estimation problem (PEP) is a computer-assisted proof methodology that analyzes the worst-case performance of optimization algorithms through semidefinite programs (Drori and Teboulle, 2014; Taylor et al., 2017a,b). The use of the PEP has led to many discoveries that would have been difficult without such assistance (Kim and Fessler, 2018a; Taylor et al., 2018; Taylor and Bach, 2019; Barré et al., 2020; De Klerk et al., 2020; Gu and Yang, 2020; Lieder, 2021; Ryu et al., 2020; Dragomir et al., 2021; Kim, 2021; Yoon and Ryu, 2021). Notably, the algorithms OGM (Drori and Teboulle, 2014; Kim and Fessler, 2016, 2018b), OGM-G (Kim and Fessler, 2021), and ITEM (Taylor and Drori) were obtained by using the PEP for the setup of minimizing a smooth convex (possibly strongly convex) function.…”
Section: Preliminaries and Notations (mentioning; confidence: 99%)
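To make the PEP methodology described above concrete, a minimal instance (our notation, not taken from the cited works) for N steps of gradient descent with step size 1/L on an L-smooth convex function f with minimizer x⋆ is

\[
\max_{f,\,x_0}\; f(x_N) - f(x_\star)
\quad \text{s.t.} \quad
\|x_0 - x_\star\| \le R,
\qquad
x_{k+1} = x_k - \tfrac{1}{L}\nabla f(x_k),\quad k = 0,\dots,N-1,
\]

where the maximization ranges over all L-smooth convex f. Interpolation conditions (Taylor et al., 2017) replace the infinite-dimensional variable f by finitely many constraints on the values f(x_k) and gradients ∇f(x_k), turning the problem into an equivalent finite semidefinite program over the Gram matrix of the iterates and gradients; its optimal value is the exact worst-case guarantee of the method.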
“…When combined with restart techniques, improved rates for the uniformly- and/or strongly-convex case were also obtained in [3,9] (see also [12]). The A-HPE for strongly convex problems was also recently studied in [5] within the framework of "performance estimation problems" (PEPs) (see remark (iv) following Algorithm 1). We also mention that local superlinear convergence rates for tensor methods were obtained in [7].…”
Section: Some Previous Contributions Based on the A-HPE Framework of … (mentioning; confidence: 99%)
“…(iv) We also mention that Algorithm 1 is closely related to a variant of the A-HPE for strongly convex objectives presented and studied in [5, Section 4.2]. However, in contrast to the analysis in [5], which is based on "performance estimation problems" (PEPs), in this contribution we take an approach similar to the one taken in [18,23]. In doing so, we obtain global convergence rates for Algorithm 1 in terms of function values, sequences and (sub-)gradients (see Theorems 2.6 and 2.9).…”
Section: Some Previous Contributions Based on the A-HPE Framework of … (mentioning; confidence: 99%)