Proceedings of the International Congress of Mathematicians (ICM 2018), 2019
DOI: 10.1142/9789813272880_0198

Worst-Case Evaluation Complexity and Optimality of Second-Order Methods for Nonconvex Smooth Optimization

Abstract: We establish or refute the optimality of inexact second-order methods for unconstrained nonconvex optimization from the point of view of worst-case evaluation complexity, improving and generalizing the results of [15,19]. To this aim, we consider a new general class of inexact second-order algorithms for unconstrained optimization that includes regularization and trust-region variations of Newton's method as well as of their linesearch variants. For each method in this class and arbitrary accuracy threshold ε …
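
As a concrete (if simplified) illustration of the algorithm class the abstract describes, the sketch below implements one adaptive cubic-regularization iteration, a regularized Newton-type second-order method. The test function, the acceptance constants, and the inexact subproblem solve via a general-purpose minimizer are all assumptions made for the demo, not details taken from the paper.

```python
# Minimal sketch of an adaptive cubic-regularization (AR2-style) method,
# one member of the inexact second-order class discussed above.
# All numerical choices here are illustrative assumptions.
import numpy as np
from scipy.optimize import minimize

def ar2_step(f, grad, hess, x, sigma):
    """One trial step: approximately minimize the cubic model
    m(s) = f(x) + g.s + 0.5 s.H.s + (sigma/3)||s||^3, then accept or reject."""
    g, H = grad(x), hess(x)
    model = lambda s: (f(x) + g @ s + 0.5 * s @ H @ s
                       + (sigma / 3.0) * np.linalg.norm(s) ** 3)
    s = minimize(model, -0.01 * g).x                   # inexact subproblem solve
    pred = max(f(x) - model(s), 1e-16)                 # predicted decrease
    rho = (f(x) - f(x + s)) / pred                     # actual / predicted
    if rho >= 0.1:                                     # successful: accept step
        return x + s, max(sigma / 2.0, 1e-8)
    return x, 2.0 * sigma                              # unsuccessful: raise sigma

# Demo on a small nonconvex function (hypothetical, bounded below).
f = lambda x: x[0] ** 4 / 4.0 + x[0] * x[1] + x[1] ** 2
grad = lambda x: np.array([x[0] ** 3 + x[1], x[0] + 2.0 * x[1]])
hess = lambda x: np.array([[3.0 * x[0] ** 2, 1.0], [1.0, 2.0]])

x, sigma = np.array([1.5, -1.0]), 1.0
while np.linalg.norm(grad(x)) > 1e-6:                  # epsilon = 1e-6 stopping rule
    x, sigma = ar2_step(f, grad, hess, x, sigma)
print("approximate first-order point:", x)
```

The accept/reject test on ρ and the update of the regularization weight σ are the kind of mechanism shared by the regularization, trust-region, and linesearch variants the abstract groups into one class; the worst-case evaluation complexity question is how many calls to f, grad, and hess such a loop needs before the stopping rule triggers.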

Cited by 23 publications (37 citation statements); references 28 publications. The citation statements below are ordered by relevance.

“…Our results extend worst-case evaluation complexity bounds for smooth nonconvex optimization in [11,12] which do not use the structure of partially separable functions and do not consider the Lipschitzian singularity. Moreover, our results subsume the results for non-Lipschitz nonconvex optimization in [17] which only consider the complexity with q = 1 and n_i = 1 for i ∈ H.…”
Section: Introduction (supporting)
confidence: 57%
“…using the step termination criteria (4.1) and (4.2), which again replace a simpler version based solely on first-order information. The second is that the PSARp algorithm applies to the more general problem (1.1), in particular using the isotropic model (3.12) to allow n_i > 1 for i ∈ H. As alluded to above and discussed in [12] and [4], the potential termination of the algorithm in Step 2 can only happen whenever q > 2 and x_k = x_ε is an (ε, 1)-approximate p-th-order-necessary minimizer within R_k, which, together with (2.5), implies that the same property holds for problem (1.1). This is a significantly stronger optimality condition than what is required by (4.5).…”
Section: The Adaptive Regularization Algorithm (mentioning)
confidence: 99%
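
For context, the "(ε, 1)-approximate p-th-order-necessary minimizer" in this excerpt refers to the arbitrary-order optimality measure of Cartis, Gould and Toint. Stated roughly (the precise normalization varies across their papers, so consult [12] for the exact form), x is an (ε, δ)-approximate q-th-order-necessary minimizer of f when the best decrease achievable by the degree-q Taylor model T_{f,q} of f at x within a ball of radius δ is suitably small:

```latex
% Sketch of the approximate q-th-order optimality measure:
\phi_{f,q}^{\delta}(x) \;:=\; f(x) \;-\; \min_{\|d\|\le\delta} T_{f,q}(x,d)
\;\le\; \epsilon\,\frac{\delta^{\,q}}{q!}.
```

For q = 1 and δ = 1 this reduces to the familiar first-order condition ‖∇f(x)‖ ≤ ε.
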
“…Our bounds on deterministic algorithms require dimension d = 1 + 2T(ε), while our bounds on all randomized algorithms require d = c · T(ε)² log T(ε) for a numerical constant c < ∞. In contrast, the results of Vavasis [44] and Cartis et al. [16,17,18,20] hold with d = 2 independent of ε. Inevitably, they do so at a cost; the lower bound [44] is loose, while the lower bounds [16,17,18,20] apply only to restricted algorithm classes that exclude the aforementioned grid-search and cutting-plane algorithms.…”
Section: The Importance of High-dimensional Constructions (mentioning)
confidence: 86%
“…Cartis et al also develop algorithm-specific lower bounds on the iteration complexity of finding approximate stationary points. Their works [16,17] show that the performance guarantees for gradient descent and cubic regularization of Newton's method are tight for two-dimensional functions they construct, and they also extend these results to certain structured classes of methods [18,20].…”
Section: Related Lower Bounds (mentioning)
confidence: 99%
“…While upper complexity bounds are important as they provide a handle on the intrinsic difficulty of the considered problem, they do so at the condition of not being overly pessimistic. To address this last point, lower bounds on the evaluation complexity of unconstrained nonconvex optimization problems and methods were derived in [4,17] and [12], where it was shown that the known upper complexity bounds are sharp (irrespective of problem's dimension) for most known methods using Taylor's models of degree one or two. That is to say that there are examples for which the complexity order predicted by the upper bound is actually achieved.…”
Section: Introduction (mentioning)
confidence: 99%
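
For reference, the sharp orders these last two excerpts allude to are the following: for a method built on a Taylor model of degree p, with the p-th derivative assumed Lipschitz continuous, the worst-case number of evaluations needed to reach ‖∇f(x)‖ ≤ ε is, per the Cartis–Gould–Toint line of work (including the paper above for p = 2):

```latex
% Sharp worst-case evaluation complexity for first-order points with
% degree-p Taylor-model methods (p-th derivative Lipschitz):
O\!\bigl(\epsilon^{-(p+1)/p}\bigr),
\qquad\text{so } O(\epsilon^{-2})\ \text{for } p=1 \text{ (steepest descent)}
\text{ and } O(\epsilon^{-3/2})\ \text{for } p=2 \text{ (cubic regularization).}
```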