2008
DOI: 10.1016/j.cam.2007.07.017

A derivative-free nonmonotone line-search technique for unconstrained optimization

Abstract: A tolerant derivative-free nonmonotone line-search technique is proposed and analyzed. Several consecutive increases in the objective function, and also nondescent directions, are admitted for unconstrained minimization. To exemplify the power of this new line search, we describe a direct search algorithm in which the directions are chosen randomly. The convergence properties of this random method rely exclusively on the line-search technique. We present numerical experiments to illustrate the advantages of usin…
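To make the idea concrete, here is a minimal Python sketch of a tolerant nonmonotone backtracking line search of the kind described above, driving a direct search with randomly chosen directions. The acceptance test, the forcing term gamma * alpha^2 * ||d||^2, the summable tolerance sequence eta_k, the memory size M, and all function names are assumptions modeled on standard nonmonotone conditions, not the authors' exact algorithm.

import numpy as np

def tolerant_nonmonotone_search(f, x, d, f_history, eta,
                                gamma=1e-4, M=10, max_backtracks=30):
    # Accept a trial point when
    #   f(x + a*d) <= max(last M values of f) + eta - gamma * a^2 * ||d||^2.
    # The tolerance eta > 0 lets the objective increase, and d need not be
    # a descent direction. (Assumed generic form, not the paper's exact rule.)
    f_max = max(f_history[-M:])      # nonmonotone reference value
    d_norm2 = float(np.dot(d, d))
    alpha = 1.0                      # always start from the unit step
    for _ in range(max_backtracks):
        trial = x + alpha * d
        if f(trial) <= f_max + eta - gamma * alpha**2 * d_norm2:
            return trial, alpha
        alpha *= 0.5                 # simple halving backtrack
    return None, 0.0                 # give up; caller tries a new direction

def random_direct_search(f, x0, n_iters=200, seed=0):
    # Direct-search driver whose directions are chosen randomly, so that
    # convergence rests entirely on the line-search technique.
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    f_history = [f(x)]
    for k in range(n_iters):
        d = rng.standard_normal(x.size)
        d /= np.linalg.norm(d)
        eta = 1.0 / (k + 1) ** 2     # summable tolerances: sum eta_k < inf
        trial, _ = tolerant_nonmonotone_search(f, x, d, f_history, eta)
        if trial is not None:
            x = trial
            f_history.append(f(x))
    return x

For example, random_direct_search(lambda v: float(np.sum(v**2)), np.ones(5)) drifts toward the origin even though individual iterations are allowed to increase the objective.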

Cited by 39 publications (35 citation statements) · References 27 publications · Citing publications span 2010–2022.
“…More related to our work is the contribution of [10], a derivative-free line search method that uses random searching directions, where at each iteration the step size is defined by a tolerant nonmonotone backtracking line search that always starts from the unitary step. The authors proved global convergence with probability one in [10], but no global rate was provided.…”
Section: Introduction (mentioning)
confidence: 99%
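For context, the "tolerant nonmonotone" acceptance test these quotations refer to typically has the form below (in LaTeX); the exact memory rule and constants used in [10] may differ, so treat this as a generic template:

f(x_k + \alpha d_k) \le \max_{0 \le j \le \min(k, M-1)} f(x_{k-j}) + \eta_k - \gamma \alpha^2 \lVert d_k \rVert^2, \qquad \sum_{k=0}^{\infty} \eta_k < \infty,

with backtracking on \alpha starting from \alpha = 1, which is the "unitary step" mentioned above; the summable tolerances \eta_k are what make the search "tolerant".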
“…The authors proved global convergence with probability one in [10], but no global rate was provided. In addition, global convergence in [10] is much easier to establish than in our paper, since the backtracking line search always starts from the unitary step, independently of the previous iterations, and consequently the step size is little coupled to the history of the computation.…”
Section: Introduction (mentioning)
confidence: 99%
“…where X_k := X(ks) with k = 0, 1, 2, …, and the step-size h would be selected appropriately via the line-search algorithm [8, 17–19] (presented in the ensuing section).…”
Section: Discrete-Time ZNN Model (mentioning)
confidence: 99%
“…Different activation functions and different step sizes (which could be obtained via a line-search algorithm [8, 17–19]) are employed for such a model. When the linear activation function and step-size h = 1 are used, the discrete-time ZNN model reduces exactly to Newton iteration for matrix square root finding.…”
Section: Introduction (mentioning)
confidence: 99%
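That reduction is easy to check numerically. Below is a small Python sketch of the classical Newton iteration for the principal matrix square root, X_{k+1} = (X_k + X_k^{-1} A)/2, which is the scheme the quotation says the discrete-time ZNN model recovers with a linear activation function and h = 1; the function name newton_sqrtm and the test matrix are illustrative, and the precise variant used in the cited ZNN papers may differ.

import numpy as np

def newton_sqrtm(A, n_iters=50, tol=1e-12):
    # Newton iteration for the principal matrix square root:
    #   X_{k+1} = (X_k + X_k^{-1} A) / 2,  X_0 = A.
    # Converges in exact arithmetic when A has no eigenvalues on the
    # closed negative real axis; the plain iteration can be numerically
    # unstable for ill-conditioned A, so this is illustration only.
    X = A.copy()
    for _ in range(n_iters):
        X_new = 0.5 * (X + np.linalg.solve(X, A))   # X^{-1} A via a solve
        if np.linalg.norm(X_new - X) <= tol * np.linalg.norm(X):
            return X_new
        X = X_new
    return X

# Quick check on a symmetric positive definite test matrix.
rng = np.random.default_rng(1)
M = rng.standard_normal((4, 4))
A = M @ M.T + 4.0 * np.eye(4)        # SPD, so a principal square root exists
X = newton_sqrtm(A)
print(np.linalg.norm(X @ X - A))     # should be near zero: X @ X recovers A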
“…As we already mentioned, some of the gradient-free approaches, see for example [5, 7], will probably make these algorithms more applicable in practice. Therefore, it will be the subject of future research.…”
Section: Numerical Results for Noisy Problems (mentioning)
confidence: 99%