2018
DOI: 10.1109/lcsys.2017.2722406

The Fastest Known Globally Convergent First-Order Method for Minimizing Strongly Convex Functions

Citation Types: 2 supporting, 89 mentioning, 0 contrasting

Year Published: 2019–2022

Cited by 114 publications (91 citation statements)
References 10 publications
“…Note that the bound for the SSEP method was generated for the form (20); bounds for the efficient form (21) behave almost exactly like the bounds for the form (20) (the difference could not be observed from this plot) and were therefore omitted from the comparison. Finally, the numerical results presented above support the conjectures that the subspace-searching GFOM enjoys an $O\big((1-\sqrt{2}\sqrt{\kappa^{-1}})^N R_x^2\big)$ rate of convergence, while the SSEP-based method that performs no line searches converges at the faster rate $O\big((1-2\sqrt{\kappa^{-1}})^N R_x^2\big)$ (matching the rate of convergence bounds on TMM [52, Theorem 1]). This is not in contradiction with the theory, as noted in Remark 3; however, we are currently unable to find an intuitive explanation for this phenomenon beyond the algebraic observation that the PEPs corresponding to the methods (20) and (21) have fewer degrees of freedom than the PEP for GFOM.…”
Section: SSEP-Based Gradient Methods for Smooth Strongly Convex Minimization (supporting)
confidence: 72%
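The two conjectured rates quoted above differ only in their contraction factors, $1-\sqrt{2}\sqrt{\kappa^{-1}}$ versus $1-2\sqrt{\kappa^{-1}}$. A minimal sketch in Python comparing them numerically for a representative $\kappa = 100$; the tolerance and variable names are illustrative choices, not taken from the cited papers.

```python
# A minimal numerical comparison of the two conjectured contraction factors
# quoted above, for kappa = 100. Tolerance and names are illustrative.
import math

kappa = 100.0
rho_gfom = 1 - math.sqrt(2) * math.sqrt(1 / kappa)  # conjectured GFOM factor
rho_ssep = 1 - 2 * math.sqrt(1 / kappa)             # conjectured SSEP/TMM factor

for name, rho in [("GFOM", rho_gfom), ("SSEP/TMM", rho_ssep)]:
    # Iterations N needed so that rho**N drops below a tolerance of 1e-9.
    n_iters = math.ceil(math.log(1e-9) / math.log(rho))
    print(f"{name}: contraction factor {rho:.4f}, ~{n_iters} iterations")
```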
“…Since the approach is limited by our capability to accurately solve SDPs, it is important to note that PEPs can be used again for validating the performance of the final method (shown in the second-to-last row of Table 1). Figure 1 presents a comparison of the worst-case bounds in the case $\kappa = 100$ for GFOM, for an SSEP-based method performing no line searches, the celebrated fast (or accelerated) gradient method (FGM) for smooth strongly convex minimization [36, Theorem 2.1.12], and the very recent triple momentum method (TMM) [52]. These worst-case bounds were derived numerically by solving the corresponding PEPs using the interpolation conditions presented in Example 1 (see [13, 50] for details on the derivation of PEPs for fixed-step methods).…”
Section: SSEP-Based Gradient Methods for Smooth Strongly Convex Minimization (mentioning)
confidence: 99%
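Since the cited paper is the triple momentum method itself, a minimal sketch of TMM may help ground the comparison above. The parameter formulas below follow the TMM paper's Theorem 1 as commonly stated, with $\rho = 1 - 1/\sqrt{\kappa}$; the quadratic test function, dimension, and iteration count are illustrative assumptions.

```python
# A sketch of the triple momentum method on a strongly convex quadratic with
# condition number kappa = 100. Parameter formulas follow the TMM paper; the
# test function, dimension, and iteration count are illustrative assumptions.
import numpy as np

L, m = 100.0, 1.0                       # smoothness / strong-convexity moduli
kappa = L / m
rho = 1 - 1 / np.sqrt(kappa)            # TMM contraction factor
alpha = (1 + rho) / L
beta = rho**2 / (2 - rho)
gamma = rho**2 / ((1 + rho) * (2 - rho))
delta = rho**2 / (1 - rho**2)

d = np.linspace(m, L, 50)               # f(x) = 0.5 x^T diag(d) x, minimizer x* = 0
grad = lambda x: d * x

xi_prev = xi = x0 = np.ones(50)         # initialize xi_0 = xi_{-1} = x_0
for k in range(200):
    y = (1 + gamma) * xi - gamma * xi_prev
    xi_prev, xi = xi, (1 + beta) * xi - beta * xi_prev - alpha * grad(y)
x = (1 + delta) * xi - delta * xi_prev  # output iterate

print("||x - x*||      =", np.linalg.norm(x))
print("rho**200 ||x0|| =", rho**200 * np.linalg.norm(x0))  # reference contraction
```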
“…As $z_k = \operatorname{prox}_{\alpha g}(2y_k - x_k)$ and $z^\star = \operatorname{prox}_{\alpha g}(2y^\star - x^\star)$, we similarly conclude that $e_k^\top Q^{(2)} e_k \ge 0$ is implied by the incremental QC that $\operatorname{prox}_{\alpha g}$ satisfies. Returning to (17), if we multiply from the left and right by $e_k^\top$ and $e_k$ respectively, we obtain…”
Section: Convergence Certificates via Matrix Inequalities (mentioning)
confidence: 99%
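The quoted update $z_k = \operatorname{prox}_{\alpha g}(2y_k - x_k)$ is the middle step of a standard Douglas-Rachford splitting (DRS) iteration. A minimal sketch, assuming a small lasso instance with $f(x) = \tfrac12\|Ax-b\|^2$ and $g(x) = \lambda\|x\|_1$; all names and problem data here are illustrative, not taken from the citing paper.

```python
# A minimal DRS sketch using the quoted update z_k = prox_{alpha g}(2 y_k - x_k),
# applied to a small lasso instance. All problem data are illustrative.
import numpy as np

rng = np.random.default_rng(0)
A, b = rng.standard_normal((20, 10)), rng.standard_normal(20)
alpha, lam = 0.5, 0.1

# prox of f(x) = 0.5 ||Ax - b||^2: solve (I + alpha A^T A) y = v + alpha A^T b.
M = np.eye(10) + alpha * A.T @ A
prox_f = lambda v: np.linalg.solve(M, v + alpha * A.T @ b)
# prox of g(x) = lam ||x||_1: soft-thresholding at level alpha * lam.
prox_g = lambda v: np.sign(v) * np.maximum(np.abs(v) - alpha * lam, 0.0)

x = np.zeros(10)
for k in range(500):
    y = prox_f(x)          # y_k = prox_{alpha f}(x_k)
    z = prox_g(2 * y - x)  # z_k = prox_{alpha g}(2 y_k - x_k), as in the quote
    x = x + (z - y)        # x_{k+1} = x_k + (z_k - y_k), relaxation 1

print("fixed-point residual ||z - y|| =", np.linalg.norm(z - y))
```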
“…We now select algorithm parameters that satisfy the matrix inequality in (17). In doing so, we arrive at a new and simple proof of the $O(1/k)$ convergence of DRS in the non-strongly convex and non-smooth case.…”
Section: Case 1: Non-Strongly Convex and Non-Smooth Case (mentioning)
confidence: 99%
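As a rough empirical companion to the quoted $O(1/k)$ claim: the fixed-point residual of an averaged iteration such as DRS is known to decay at rate $O(1/k)$, so $k\,\|x_{k+1}-x_k\|^2$ should remain bounded. The check below repeats the illustrative lasso setup from the previous sketch so it runs on its own; it is an empirical illustration, not the matrix-inequality proof referenced in the quote.

```python
# Empirical check that k * ||x_{k+1} - x_k||^2 stays bounded for the DRS
# iteration above, consistent with the O(1/k) behavior asserted in the quote.
import numpy as np

rng = np.random.default_rng(0)
A, b = rng.standard_normal((20, 10)), rng.standard_normal(20)
alpha, lam = 0.5, 0.1
M = np.eye(10) + alpha * A.T @ A
prox_f = lambda v: np.linalg.solve(M, v + alpha * A.T @ b)
prox_g = lambda v: np.sign(v) * np.maximum(np.abs(v) - alpha * lam, 0.0)

x = np.zeros(10)
for k in range(1, 2001):
    y = prox_f(x)
    z = prox_g(2 * y - x)
    x_next = x + (z - y)
    if k % 500 == 0:  # the scaled squared residual should not grow with k
        print(f"k={k}: k * ||x_(k+1) - x_k||^2 = {k * np.linalg.norm(x_next - x)**2:.3e}")
    x = x_next
```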