2013
DOI: 10.1007/s13675-012-0003-7

Worst case complexity of direct search

Abstract: In this paper we prove that direct search of directional type shares the worst case complexity bound of steepest descent when sufficient decrease is imposed using a quadratic function of the step size parameter. This result is proved under smoothness of the objective function, within a GSS (generating set search) framework. We also discuss the worst case complexity of direct search when only simple decrease is imposed and when the objective function is nonsmooth.
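To make the setting concrete, here is a minimal Python sketch (ours, not from the paper) of a directional direct search iteration in the GSS style, imposing sufficient decrease through a quadratic forcing function of the step size; the names direct_search, c, beta, and gamma are illustrative choices, not notation from the paper.

```python
import numpy as np

def direct_search(f, x0, alpha0=1.0, c=1e-4, beta=0.5, gamma=1.0,
                  alpha_tol=1e-8, max_evals=10_000):
    """GSS-style directional direct search with sufficient decrease.

    A trial point is accepted only if it improves on f(x) by at least
    c * alpha**2, a quadratic forcing function of the step size, the
    condition under which the steepest-descent-type complexity bound holds.
    """
    n = len(x0)
    D = np.vstack([np.eye(n), -np.eye(n)])  # coordinate vectors and negatives
    x = np.asarray(x0, dtype=float)
    fx, evals, alpha = f(x), 1, alpha0
    while alpha > alpha_tol and evals < max_evals:
        success = False
        for d in D:  # poll step
            f_trial = f(x + alpha * d)
            evals += 1
            if f_trial < fx - c * alpha**2:  # sufficient decrease test
                x, fx, success = x + alpha * d, f_trial, True
                break
        # Expand the step size on success (gamma >= 1), contract otherwise.
        alpha = gamma * alpha if success else beta * alpha
    return x, fx
```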

Cited by 64 publications (101 citation statements). References 17 publications.
“…Increasing C, for instance, will decrease the number of successful iterations [17, Theorem 3.1], possibly leading to more unnecessary unsuccessful iterations and consequently more unnecessary function evaluations. Increasing the value of the expansion factor γ ≥ 1 will increase the maximum number of unsuccessful iterations compared to the number of successful ones [17, Theorem 3.2], again possibly leading to more unnecessary unsuccessful iterations and consequently more unnecessary function evaluations. Setting γ = 1 leads to an optimal choice in this respect.…”
Section: Search Step
confidence: 99%
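As a concrete reading of this trade-off, the hypothetical direct_search sketch shown after the abstract can be run with γ = 1 (no expansion), the choice the quoted passage identifies as optimal in this respect:

```python
import numpy as np

# Simple smooth test function; the minimizer is the origin.
f = lambda x: float(x @ x)

# gamma=1 keeps the step size non-increasing, avoiding the extra
# unsuccessful iterations (and function evaluations) that expansion can
# cause; the constant C of the quoted passage enters through c here.
x_best, f_best = direct_search(f, x0=np.ones(3), c=1e-4, gamma=1.0)
print(x_best, f_best)  # converges toward the origin
```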
“…We know from [17] that, in this bound, only the minimum cosine measure of the positive spanning sets depends explicitly on n. One also knows, from the positive spanning set formed by the coordinate vectors and their negatives, that such minimum cosine measure can be set greater than or equal to 1/√n, and thus 1/ω ≤ O(np²), where ω is given in (7). On the other hand, each poll step when using such positive spanning sets costs at most O(n) function evaluations.…”
Section: Theorem 4.2 (Let Assumptions 4.2 Hold)
confidence: 99%
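The 1/√n fact for the coordinate vectors and their negatives can be checked numerically; the following small sketch (ours, with the illustrative name cosine_measure_estimate) estimates the cosine measure by sampling unit vectors, which bounds the true minimum from above:

```python
import numpy as np

def cosine_measure_estimate(D, n_samples=200_000, seed=0):
    """Monte Carlo estimate of the cosine measure of a positive spanning set.

    cm(D) = min over unit vectors v of max over d in D of (v . d)/|d|.
    Sampling v on the sphere gives an upper bound on the true minimum.
    """
    rng = np.random.default_rng(seed)
    V = rng.standard_normal((n_samples, D.shape[1]))
    V /= np.linalg.norm(V, axis=1, keepdims=True)      # unit vectors v
    Dn = D / np.linalg.norm(D, axis=1, keepdims=True)  # unit directions d
    # For each v, take the best cosine over D; then the worst case over v.
    return float(np.min(np.max(V @ Dn.T, axis=1)))

n = 4
D_plus = np.vstack([np.eye(n), -np.eye(n)])  # coordinate vectors and negatives
print(cosine_measure_estimate(D_plus), 1 / np.sqrt(n))  # both ≈ 0.5
```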
“…However, it should be noted that the stronger requirement imposed by the sufficient decrease condition (over the simple one) does not necessarily lead to less efficient algorithms, either in terms of the number of function evaluations or of the final function value attained by the search routine. Further, as recently shown in [28], imposing a sufficient decrease condition like the one adopted in the present paper makes it possible to derive a worst case complexity bound on the number of iterations a direct search algorithm needs to drive the norm of the objective gradient below a prescribed accuracy, matching the bound obtained for the steepest descent method in [25] in the presence of first order derivatives. On the contrary, if only a simple decrease condition is imposed, the worst case complexity bound on the number of iterations seems provable only under additional strong conditions, such as the objective function satisfying an appropriate decrease rate.…”
Section: Introduction
confidence: 77%
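For reference, a schematic LaTeX rendering (ours, consistent with the abstract but not a verbatim statement from the paper) of the sufficient decrease condition with quadratic forcing and the resulting steepest-descent-type bound:

```latex
% Sufficient decrease with a quadratic forcing function of the step size:
f(x_k + \alpha_k d_k) \le f(x_k) - c\,\alpha_k^2, \qquad c > 0.
% Worst case bound: the number of iterations needed to reach
% \|\nabla f(x_k)\| \le \epsilon is O(\epsilon^{-2}), as for steepest descent.
```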
“…From (26) and (31) we obtain (28). Then, by (25), (27), (28) and by considering that, by assumption, f(x) = f(x⋆), we get (29).…”
Section: Lemma 5 (The Local Search Procedure Is Well-Defined, i.e., It …)
confidence: 95%