2019
DOI: 10.1007/s10107-019-01410-2

Efficient first-order methods for convex minimization: a constructive approach

Abstract: We describe a novel constructive technique for devising efficient first-order methods for a wide range of large-scale convex minimization settings, including smooth, non-smooth, and strongly convex minimization. The technique builds upon a certain variant of the conjugate gradient method to construct a family of methods such that a) all methods in the family share the same worst-case guarantee as the base conjugate gradient method, and b) the family includes a fixed-step first-order method. We demonstrate the …
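For intuition, the "fixed-step first-order method" in item b) of the abstract is a scheme whose iterates are formed from past iterates and gradients with pre-determined coefficients, with no line search at run time. The sketch below shows a classical member of that class, Nesterov's fast gradient method for L-smooth convex minimization; it illustrates the general pattern only and is not the specific method constructed in the paper.

```python
import numpy as np

def nesterov_fgm(grad, x0, L, n_iters=100):
    """Nesterov's fast gradient method for an L-smooth convex f.

    A classical fixed-step first-order method: every iterate is a
    combination of past iterates and gradients with pre-set
    coefficients (no line search), and f(x_k) - f* decays at the
    O(L / k^2) worst-case rate.
    """
    x = y = np.asarray(x0, dtype=float)
    t = 1.0
    for _ in range(n_iters):
        x_next = y - grad(y) / L                          # fixed step of size 1/L
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)  # momentum extrapolation
        x, t = x_next, t_next
    return x

# Usage on the smooth convex quadratic f(x) = 0.5 x^T A x - b^T x.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
L = np.linalg.eigvalsh(A).max()                           # gradient Lipschitz constant
print(nesterov_fgm(lambda x: A @ x - b, np.zeros(2), L))  # ~ np.linalg.solve(A, b)
```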

Cited by 33 publications (40 citation statements: 1 supporting, 39 mentioning, 0 contrasting). References 36 publications.
“…However, convergence rates are only one side of the coin; in several applications, e.g., in a data-based setting, robustness with respect to various kinds of disturbances is also a key issue, and similar analysis tools have been developed [2,26]. In addition, the problem of designing algorithms specifically tailored to classes of structured optimization problems has only been touched upon so far [8,11,18]. However, in situations where a class of optimization problems needs to be solved repeatedly online, as for example in model predictive control or reinforcement learning, well-performing algorithms are key.…”
Section: Introduction (mentioning, confidence: 99%)
“…Universal gradient methods for convex problems were proposed in [29] and extended in [12] to non-convex problems and in [31] to the primal-dual setting. Finally, there have been some attempts to combine universality with small-dimension relaxation [8,14]. Concerning conjugate gradient methods, we refer the reader to a good survey [2] and the classical book [30].…”
Section: Introduction (mentioning, confidence: 99%)
“…$A_{k+1}$, inequality (8) holds. For option b) in step 4, (8) holds by the choice of $a_{k+1}$ from the equation…”
(mentioning, confidence: 97%)
“…By combining the core idea of Nesterov's Universal Fast Gradient Method with the framework described by Allen-Zhu et al. in [1], such a method was devised. To the best of the authors' knowledge, our work contains the first example of such a method, although a method utilising exact line search for solving minimization problems with convex Lipschitz-continuous objectives was recently constructed by Drori et al. [4]. Their work also contains an example of a universal method which uses an exact three-dimensional subspace minimization at each iteration.…”
Section: Introduction (mentioning, confidence: 99%)
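To make the contrast with fixed-step methods concrete: an exact line-search method chooses, at every iteration, the step size that exactly minimizes the objective along the current search direction. The sketch below is a generic illustration of that oracle, not the scheme of Drori et al. [4]; it applies exact line search to steepest descent on a quadratic, where the minimizing step has a closed form.

```python
import numpy as np

def steepest_descent_exact_ls(A, b, x0, n_iters=50):
    """Steepest descent with exact line search on
    f(x) = 0.5 x^T A x - b^T x, with A symmetric positive definite.

    Unlike a fixed-step method, the step size alpha is chosen per
    iteration as argmin_a f(x - a g); for quadratics this is
    alpha = (g^T g) / (g^T A g).
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        g = A @ x - b                      # gradient of f at x
        if np.allclose(g, 0.0):            # already stationary
            break
        x = x - ((g @ g) / (g @ (A @ g))) * g
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(steepest_descent_exact_ls(A, b, np.zeros(2)))  # ~ np.linalg.solve(A, b)
```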