2018
DOI: 10.1137/17m1136845

Analysis of Optimization Algorithms via Integral Quadratic Constraints: Nonstrongly Convex Problems

Abstract: In this paper, we develop a unified framework able to certify both exponential and subexponential convergence rates for a wide range of iterative first-order optimization algorithms. To this end, we construct a family of parameter-dependent nonquadratic Lyapunov functions that can generate convergence rates in addition to proving asymptotic convergence. Using Integral Quadratic Constraints (IQCs) from robust control theory, we propose a Linear Matrix Inequality (LMI) to guide the search for the parameters of t…
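The LMI-based certification described in the abstract can be illustrated on the simplest case: gradient descent with step size 1/L on an m-strongly convex, L-smooth function. The following is a minimal numerical sketch, not the paper's general SDP. The 2×2 matrix below encodes the standard Lyapunov/IQC feasibility condition with a scalar Lyapunov weight p = 1, and the multiplier value λ = 0.02 was chosen by hand for the specific pair m = 1, L = 10; all variable names are illustrative.

```python
import numpy as np

# Problem class: f is m-strongly convex with an L-Lipschitz gradient.
m, L = 1.0, 10.0
alpha = 1.0 / L              # gradient-descent step size
rho = 1.0 - m / L            # rate to certify: |x_k - x*| <= rho^k |x_0 - x*|

# Gradient descent as a linear system in (state, gradient): x+ = A x + B u.
A, B = 1.0, -alpha

# Pointwise sector IQC for the gradient of an m-strongly convex, L-smooth f:
# (u - m x)(L x - u) >= 0 for x = x_k - x* and u = grad f(x_k).
M = np.array([[-m * L, (m + L) / 2],
              [(m + L) / 2, -1.0]])

# Feasibility condition: with V(x) = p x^2,
#   p [A;B][A;B]^T - rho^2 p E11 + lam * M  must be negative semidefinite,
# which guarantees V(x_{k+1}) <= rho^2 V(x_k) along every trajectory.
p = 1.0
lam = 0.02                   # hand-picked nonnegative multiplier for m=1, L=10
N = p * np.array([[A * A, A * B],
                  [A * B, B * B]])
N[0, 0] -= rho**2 * p
N += lam * M

eigs = np.linalg.eigvalsh(N)
print("max eigenvalue:", eigs.max())   # <= 0 up to roundoff: rate certified
```

In the full framework this feasibility check becomes a semidefinite program searched over p and λ (and, for the nonquadratic Lyapunov functions of the paper, additional parameters); here both were fixed by hand to keep the sketch self-contained.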

Cited by 99 publications (140 citation statements)
References 31 publications
“…To overcome this challenge, we present a modified semidefinite program that uses more general Lyapunov functions which are obtained by augmenting standard quadratic terms with the objective function. This type of generalized Lyapunov functions has been introduced in [46], [49] and used to study convergence of optimization algorithms for non-strongly convex problems. We employ a modified SDP to derive meaningful upper bounds on J in (3) for Nesterov's method as well.…”
Section: General Strongly Convex Problems
confidence: 99%
“…to further tighten the constraints on the gradient ∇f and reduce conservativeness. In what follows, we build on the results of [46] and present an alternative LMI in Lemma 2 that is obtained using a Lyapunov function of the form…”
Section: General Strongly Convex Problems
confidence: 99%
“…Note that the matrix inequality in (15) is linear in all the parameters except for α and λ. We can use the same technique as shown in Section III-B to transform (15) into an LMI when the stepsize α is fixed. Let…”
Section: Linear Convergence of TOS
confidence: 99%
“…Then the same methods in the proofs of Appendix A-A and A-B apply here. If (15) holds, then v_k^T W_2 v_k ≤ 0, which means V_{k+1} ≤ ρ^2 V_k and linear convergence (16) holds.…”
Section: Proof of Theorem
confidence: 99%
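The decrease condition quoted above, V_{k+1} ≤ ρ² V_k, is exactly what yields linear convergence, since chaining it over k steps gives V_k ≤ ρ^{2k} V_0. A small sketch checks this per-step decrease numerically; it uses plain gradient descent on a strongly convex quadratic as an illustrative stand-in, since the details of the cited three-operator splitting method are not given here.

```python
import numpy as np

# Stand-in problem: f(x) = 0.5 x^T Q x with eigenvalues of Q in [m, L].
m, L = 1.0, 10.0
Q = np.diag([1.0, 4.0, 10.0])
alpha = 1.0 / L
rho = 1.0 - m / L            # contraction factor for this step size

x = np.array([5.0, -3.0, 2.0])
V = [float(x @ x)]           # Lyapunov function V_k = ||x_k - x*||^2, x* = 0
for _ in range(50):
    x = x - alpha * (Q @ x)  # gradient step
    V.append(float(x @ x))

# Per-step decrease V_{k+1} <= rho^2 V_k; chaining gives V_k <= rho^(2k) V_0.
ok = all(V[k + 1] <= rho**2 * V[k] + 1e-12 for k in range(50))
print("V_{k+1} <= rho^2 V_k for all k:", ok)
```

For this diagonal Q each coordinate contracts by a factor |1 - αq_i| ≤ ρ per step, so the inequality holds with room to spare; the quoted proof establishes the same recursion for TOS via the matrix condition (15).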
See 1 more Smart Citation