2018
DOI: 10.1137/17m1134329
Complexity Analysis of Second-Order Line-Search Algorithms for Smooth Nonconvex Optimization

Abstract: There has been much recent interest in finding unconstrained local minima of smooth functions, due in part to the prevalence of such problems in machine learning and robust statistics. A particular focus is algorithms with good complexity guarantees. Second-order Newton-type methods that make use of regularization and trust regions have been analyzed from such a perspective. More recent proposals, based chiefly on first-order methodology, have also been shown to enjoy optimal iteration complexity rates, while …

Cited by 64 publications (65 citation statements)
References 20 publications
“…Proof. This result follows by largely the same argument as that of the proof of [37, Lemma 13]. The main difference is due to the result of Lemma 8 which, together with (4), implies (42)…”
Section: Proof (supporting)
confidence: 52%
“…Replacing the Taylor series expansion around f in the proof of [37, Lemma 13] with this expression yields the result. We provide a full proof in Appendix A.2.…”
Section: Proof (mentioning)
confidence: 99%
“…In this section, we present a global worst-case complexity analysis of Algorithm 3. Elements of the analysis follow those in the earlier paper [26]. The…”
Section: Complexity Analysis (mentioning)
confidence: 87%
“…In a previous work [26], two authors of the current paper proposed a Newton-based algorithm in a line-search framework which has an iteration complexity of O(max{ε_g^{-3/2}, ε_H^{-3}}) when the subproblems are solved exactly, and a computational complexity of Õ(ε^{-7/4}) Hessian-vector products and/or gradient evaluations when the subproblems are solved inexactly using CG and the randomized Lanczos algorithm. Compared to the accelerated gradient methods, this approach aligns more closely with traditional optimization practice, as described in Section 1.…”
Section: Complexity In Nonconvex Optimization (mentioning)
confidence: 99%
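The passage above describes the general pattern of a Newton-based line-search framework in which the (damped) Newton system is solved inexactly with conjugate gradients. A minimal, hypothetical sketch of that pattern follows; the eigenvalue-shift damping rule, CG tolerance, and Armijo constants here are ad hoc illustrations, not the algorithm analyzed in the paper:

```python
import numpy as np

def cg_solve(A, b, tol=1e-10, max_iter=100):
    """Plain conjugate gradients for a symmetric positive-definite system A x = b."""
    x = np.zeros_like(b)
    r = b - A @ x
    p = r.copy()
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

def newton_cg_linesearch(f, grad, hess, x0, g_tol=1e-8, max_iter=100):
    """Damped-Newton line search: solve (H + lam*I) d = -g approximately
    with CG, then backtrack until an Armijo sufficient-decrease test holds.
    The damping lam is a crude eigenvalue shift, for illustration only."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < g_tol:
            break
        H = hess(x)
        lam = max(0.0, 1e-6 - np.linalg.eigvalsh(H).min())  # make the system PD
        d = cg_solve(H + lam * np.eye(len(x)), -g)
        t, fx = 1.0, f(x)
        # Backtracking (Armijo) line search; d is a descent direction
        # because the damped Hessian is positive definite.
        while f(x + t * d) > fx + 1e-4 * t * (g @ d) and t > 1e-12:
            t *= 0.5
        x = x + t * d
    return x
```

For example, on the Rosenbrock function the iterates converge to its minimizer (1, 1), with CG solving each 2-by-2 damped Newton system in at most two inner iterations.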