2019
DOI: 10.1007/s10107-019-01406-y
Lower bounds for finding stationary points I

Abstract: We prove lower bounds on the complexity of finding ε-stationary points (points x such that ‖∇f(x)‖ ≤ ε) of smooth, high-dimensional, and potentially non-convex functions f. We consider oracle-based complexity measures, where an algorithm is given access to the value and all derivatives of f at a query point x. We show that for any (potentially randomized) algorithm A, there exists a function f with Lipschitz pth order derivatives such that A requires at least ε^{−(p+1)/p} queries to find an ε-stationary point. Our lo…
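Because the abstract's symbols were stripped in extraction, here is a compact LaTeX restatement of its claim; constants and the dimension requirement are omitted, so this is a paraphrase rather than the paper's exact theorem statement.

```latex
% eps-stationarity and the query lower bound, paraphrased from the abstract.
\[
  \|\nabla f(x)\| \le \epsilon \quad (\text{$\epsilon$-stationarity}),
  \qquad
  \inf_{A}\ \sup_{f \in \mathcal{F}_p}\ \mathsf{queries}(A, f, \epsilon)
  \;=\; \Omega\!\bigl(\epsilon^{-(p+1)/p}\bigr),
\]
% where \mathcal{F}_p denotes functions with Lipschitz p-th order derivatives
% and the infimum ranges over (possibly randomized) algorithms A.
```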

Cited by 104 publications (158 citation statements)
References 35 publications
“…Also, the bounds for ARp with p = 2 and 2 < r ≤ 2 + β_2 and β_2 ∈ (0, 1] are sharp and optimal for the corresponding smoothness classes [10]. We also note that for general p, r = p + 1 and β_p = 1 (the Lipschitz continuous case), [7] shows the bounds for (possibly randomized) ARp variants (in [3]) to be sharp and optimal. The difficult example functions in [7] increase in dimension with p, in contrast to the uni- or bi-variate examples in [10,11].…”
Section: General Discussion Of The Complexity Bounds (mentioning)
confidence: 96%
“…It is thus an unconditional statement that does not depend on conjectures such as P ≠ NP in complexity theory. We also note that if the goal is only to find stationary points instead of the optimum, then the problem becomes easier, requiring only (1/ε)² gradient queries to converge [12].…”
Section: Exponential Dependence On Dimension For Optimization (mentioning)
confidence: 99%
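The excerpt's remark that a stationary point needs only on the order of (1/ε)² gradient queries refers to the classical worst-case guarantee for gradient descent on smooth functions. Below is a minimal sketch of that query-counting setup; the test function and step size are hypothetical choices made only so the script runs, and a benign convex function like this one stops far sooner than the worst-case bound.

```python
# Minimal sketch: count gradient queries until ||grad f(x)|| <= eps.
# The O(1/eps^2) figure quoted above is a worst-case guarantee for L-smooth f
# with a fixed step of order 1/L; this hypothetical smooth test function
# typically reaches eps-stationarity much earlier.
import numpy as np

def grad_f(x):
    # Gradient of the hypothetical test function f(x) = 0.25*||x||^2 + log(1 + ||x||^2)
    return 0.5 * x + 2.0 * x / (1.0 + np.dot(x, x))

def queries_to_stationarity(x0, eps, step=0.2, max_queries=1_000_000):
    """Fixed-step gradient descent; returns the number of gradient queries
    made before the first eps-stationary iterate."""
    x = x0.copy()
    for queries in range(1, max_queries + 1):
        g = grad_f(x)
        if np.linalg.norm(g) <= eps:
            return queries
        x = x - step * g
    return max_queries

if __name__ == "__main__":
    x0 = np.ones(10)
    for eps in (1e-1, 1e-2, 1e-3):
        print(f"eps = {eps:g}: {queries_to_stationarity(x0, eps)} gradient queries")
```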
“…This paper focuses on the latter class and follows a now substantial trend of research giving bounds on the worst-case evaluation complexity (or oracle complexity) of obtaining first- and (more rarely) second-order-necessary minimizers for nonlinear nonconvex unconstrained optimization problems [21,17,14,19,5]. These papers all provide upper evaluation complexity bounds: they show that, to obtain an ǫ-approximate first-order-necessary minimizer (for an unconstrained problem, this is a point at which the gradient of the objective function is less than ǫ in norm), at most O(ǫ^{-2}) evaluations of the objective function are needed if a model involving first derivatives is used, and at most O(ǫ^{-3/2}) evaluations are needed if using second derivatives is permitted. This result was extended to convexly-constrained problems in [6].…”
Section: Introduction (mentioning)
confidence: 99%
“…That is to say that there are examples for which the complexity order predicted by the upper bound is actually achieved. More recently, Carmon et al. [3] provided an elaborate construction showing that at least a multiple of ǫ^{-(p+1)/p} function evaluations may be needed to obtain an ǫ-first-order-necessary unconstrained minimizer where derivatives of order at most p are used. This result, which matches in order the upper bound of [2], covers a very wide class of potential optimization methods but has the drawback of being only valid for problems whose dimension essentially exceeds the number of iterations needed, which can be very large and quickly grows when ǫ tends to zero.…”
Section: Introduction (mentioning)
confidence: 99%
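As a quick sanity check of how the lower bound of Carmon et al. [3] quoted here matches the upper bounds cited in the previous excerpt, the exponent can be evaluated at the first two derivative orders (plain arithmetic, not taken from either paper's text):

```latex
\[
  p = 1:\ \epsilon^{-(1+1)/1} = \epsilon^{-2},
  \qquad
  p = 2:\ \epsilon^{-(2+1)/2} = \epsilon^{-3/2},
\]
```

which line up with the O(ǫ^{-2}) first-order and O(ǫ^{-3/2}) second-order evaluation bounds quoted above.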