2017
DOI: 10.1137/16m108104x
Exact Worst-Case Performance of First-Order Methods for Composite Convex Optimization

Abstract: We provide a framework for computing the exact worst-case performance of any algorithm belonging to a broad class of oracle-based first-order methods for composite convex optimization, including those performing explicit, projected, proximal, conditional and inexact (sub)gradient steps. We simultaneously obtain tight worst-case guarantees and explicit instances of optimization problems on which the algorithm reaches this worst-case. We achieve this by reducing the computation of the worst-case to solving a con…


Cited by 88 publications (154 citation statements)
References 33 publications
“…This work takes place within the current effort to develop systematic/computer-guided analyses and designs of optimization algorithms. Among these, a systematic approach to lower bounds (focusing on quadratic cases) is presented by Arjevani et al. in [1], a systematic use of control theory (via integral quadratic constraints) for developing upper bounds is presented by Lessard et al. in [26], and the performance estimation approach, which aims at finding worst-case bounds, was originally developed in [13] (see also the surveys [11] and [47]). These methodologies are mostly presented as tools for performing worst-case analyses (see the numerous examples in [11,19,47,48,50,51]); however, such techniques have also recently been used to develop new methods with improved worst-case complexities.…”
Section: Links With Systematic and Computer-assisted Approaches To Wo…
confidence: 99%
“…Among these, a systematic approach to lower bounds (focusing on quadratic cases) is presented by Arjevani et al. in [1], a systematic use of control theory (via integral quadratic constraints) for developing upper bounds is presented by Lessard et al. in [26], and the performance estimation approach, which aims at finding worst-case bounds, was originally developed in [13] (see also the surveys [11] and [47]). These methodologies are mostly presented as tools for performing worst-case analyses (see the numerous examples in [11,19,47,48,50,51]); however, such techniques have also recently been used to develop new methods with improved worst-case complexities. Among others, such an approach was used in [13,22] to devise a fixed-step method that attains the best possible worst-case performance for smooth convex minimization [12], and later in [14] to obtain a variant of Kelley's cutting plane method with the best possible worst-case guarantee for non-smooth convex minimization.…”
Section: Links With Systematic and Computer-assisted Approaches To Wo…
confidence: 99%
“…The paper is organized as follows: in Section 2, we present the problem statement and the main results; in Section 3, we prove our main results (tight convergence rates for the method and for its exact line search variant); in Section 4, we summarize known and newly derived tight results for the proximal gradient method that were obtained using the performance estimation framework developed by Drori and Teboulle [9] and by the authors (see [10,11]). Finally, we conclude the work in Section 5.…”
Section: Introduction
confidence: 99%
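The snippet above concerns tight rates for the proximal gradient method. For context, here is a minimal sketch of that method applied to the lasso problem, a standard composite convex objective; the function names, step-size choice, and iteration count are illustrative and not taken from the cited works:

```python
import numpy as np

# Proximal gradient (forward-backward) sketch for the composite problem
#   min_x f(x) + g(x),  f(x) = 0.5 * ||Ax - b||^2 (smooth),
#                       g(x) = lam * ||x||_1 (nonsmooth, prox-friendly).
def prox_l1(v, t):
    """Proximal operator of t * ||.||_1: componentwise soft-thresholding."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient(A, b, lam, iters=500):
    L = np.linalg.norm(A, 2) ** 2           # Lipschitz constant of grad f
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)            # forward (gradient) step on f
        x = prox_l1(x - grad / L, lam / L)  # backward (proximal) step on g
    return x
```

With constant step size 1/L this is the plain proximal gradient iteration whose worst-case rates (and exact line-search variant) the quoted paper analyzes.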
“…Since MJ ≫ RK, the computational complexity of BCD-Net is dominated by the forward and back projections performed in the MBIR modules. To reduce the MJ factor, one can investigate faster optimization methods (e.g., the proximal optimized gradient method (POGM) [13]) with ordered subsets. Applying these techniques can reduce the MJ factor to (M/G)J′, where G is the number of subsets and J′ < J is the number of POGM iterations (e.g., J′ = (1/√2)J), owing to the faster convergence rate of POGM over APG.…”
Section: Computational Complexity
confidence: 99%
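The ordered-subsets idea invoked in the snippet above, replacing each full gradient with a cheaper scaled subset gradient so that one pass over the data yields G updates instead of one, can be sketched on a least-squares term. This is an illustrative toy under assumed names and step sizes, not the BCD-Net/POGM implementation from the cited paper:

```python
import numpy as np

# Ordered-subsets (incremental) gradient sketch for min_x 0.5 * ||Ax - b||^2.
# Each update uses only M / n_subsets of the M rows, scaled by n_subsets to
# approximate the full gradient, so a pass over the data costs roughly one
# full gradient evaluation but performs n_subsets updates.
def os_gradient(A, b, n_subsets=4, passes=200):
    M = A.shape[0]
    step = 0.1 / np.linalg.norm(A, 2) ** 2         # conservative step size
    subsets = np.array_split(np.arange(M), n_subsets)  # ordered row subsets
    x = np.zeros(A.shape[1])
    for _ in range(passes):
        for sub in subsets:
            As, bs = A[sub], b[sub]
            # Scaled subset gradient standing in for the full gradient.
            x -= step * n_subsets * (As.T @ (As @ x - bs))
    return x
```

This illustrates why the per-update cost drops by the factor G: each update touches only M/G rows, which is exactly the (M/G) factor in the quoted complexity estimate.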