2018 Annual American Control Conference (ACC)
DOI: 10.23919/acc.2018.8430824

A Robust Accelerated Optimization Algorithm for Strongly Convex Functions

Abstract: This work proposes an accelerated first-order algorithm we call the Robust Momentum Method for optimizing smooth strongly convex functions. The algorithm has a single scalar parameter that can be tuned to trade off robustness to gradient noise versus worst-case convergence rate. At one extreme, the algorithm is faster than Nesterov's Fast Gradient Method by a constant factor but more fragile to noise. At the other extreme, the algorithm reduces to the Gradient Method and is very robust to noise. The algorithm …
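To make the robustness-versus-rate trade-off concrete, below is a minimal, illustrative sketch of a momentum-style first-order iteration in which a single scalar knob rho interpolates between a conservative, gradient-descent-like tuning and a more aggressive accelerated tuning. The function name momentum_descent, the placeholder formulas for the step size alpha and momentum weight beta, and the quadratic test problem are all assumptions made for illustration; they are not the exact parameterization derived in the paper.

```python
import numpy as np

def momentum_descent(grad_f, x0, L, m, rho, num_iters=500):
    """Illustrative momentum iteration with a single robustness knob rho.

    NOT the exact Robust Momentum Method parameterization: alpha and beta
    below are placeholder choices meant only to show how a scalar rho in
    [1 - 1/sqrt(kappa), 1 - 1/kappa] can interpolate between an aggressive
    accelerated tuning (large momentum) and plain gradient descent
    (momentum shrunk toward zero).
    """
    kappa = L / m                                 # condition number of f
    rho_fast = 1.0 - 1.0 / np.sqrt(kappa)         # aggressive end of the range
    rho_slow = 1.0 - 1.0 / kappa                  # conservative end of the range
    # Map rho in [rho_fast, rho_slow] to a momentum weight in [beta_max, 0].
    t = (rho_slow - rho) / (rho_slow - rho_fast + 1e-16)
    beta = t * (np.sqrt(kappa) - 1.0) / (np.sqrt(kappa) + 1.0)  # placeholder
    alpha = 1.0 / L                               # placeholder step size

    x_prev = x = np.asarray(x0, dtype=float)
    for _ in range(num_iters):
        y = x + beta * (x - x_prev)               # extrapolation point
        x_prev, x = x, y - alpha * grad_f(y)      # gradient step at y
    return x

# Example: minimize the strongly convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose eigenvalues give m = 1 and L = 10 (so kappa = 10).
A = np.diag([1.0, 10.0])
b = np.array([1.0, 2.0])
x_star = np.linalg.solve(A, b)
x_hat = momentum_descent(lambda x: A @ x - b, np.zeros(2), L=10.0, m=1.0, rho=0.7)
print(np.linalg.norm(x_hat - x_star))             # should be close to zero
```

Under this placeholder mapping, moving rho toward 1 - 1/kappa shrinks the momentum term and the iteration behaves like plain gradient descent, mirroring the qualitative trade-off described in the abstract.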

Cited by 54 publications (63 citation statements)
References 9 publications

Citation statements:

“…In particular, such new methods were obtained by optimization of their algorithmic parameters in [11,19,20], and by analogy with conjugate-gradient type methods (doing greedy span-searches) in [13]. Performance estimation is also related to the line of work on integral quadratic constraints started by Lessard, Recht, and Packard [23], which also allowed designing optimized methods, see [8,38]. The approach in [23] may be seen as a (relaxed) version of an SDP performance estimation problem where Lyapunov functions are used to certify error bounds [37].…”
mentioning
confidence: 99%
“…Theorem 3.4: System (33) is observable from the output y defined in (5). The proof of Theorem 3.4 is given in Appendix D.…”
Section: Identifiability of the Parameters
mentioning
confidence: 95%
“…In particular, accelerated/momentum algorithms, e.g., the heavy-ball method and Nesterov acceleration (see, e.g., [5]).…”
Section: Gradient Descent Algorithms
mentioning
confidence: 99%
“…which consist of the objective function evaluated at Cψ and a quadratic function of ψ, where X is a positive definite matrix. The theory of IQCs provides a convex control-theoretic approach to analyzing optimization algorithms [33] and it was recently employed to study convergence and robustness of the first-order algorithms [14]- [17], [32], [35], [36]. The type of generalized Lyapunov functions given by (19) was introduced in [32], [37] to study convergence of optimization algorithms for non-strongly convex problems.…”
Section: B. General Strongly Convex Problems
mentioning
confidence: 99%
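As a reading aid for the passage above: the generalized Lyapunov functions it refers to combine the objective evaluated at a linear function Cψ of the algorithm state with a quadratic term in the state ψ. A plausible generic form, using assumed notation for the algorithm's fixed point ψ⋆, the minimizer x⋆, and a contraction rate ρ (the exact expression (19) appears in the cited works), is

```latex
V(\psi_k) = f(C\psi_k) - f(x^\star) + (\psi_k - \psi^\star)^\top X (\psi_k - \psi^\star),
\qquad X \succ 0,
```

so that a linear convergence certificate typically amounts to verifying a decrease condition of the form V(\psi_{k+1}) \le \rho^2 V(\psi_k) along all trajectories consistent with the quadratic (IQC-type) constraints on the gradient.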