2008
DOI: 10.1016/j.ejor.2006.09.097

Nonconvex optimization using negative curvature within a modified linesearch

Abstract: This paper describes a new algorithm for the solution of nonconvex unconstrained optimization problems, with the property of converging to points satisfying second order necessary optimality conditions. The algorithm is based on a procedure which, from two descent directions, a Newton-type direction and a direction of negative curvature, selects in each iteration the linesearch model best adapted to the properties of these directions. The paper also presents results of numerical experiments that illustrate i…
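The abstract refers to a pair of descent directions: a Newton-type direction and a direction of negative curvature. As an illustrative sketch only (not the paper's actual procedure), such a pair can be obtained from an eigendecomposition of the Hessian; the function name and tolerance below are assumptions for the example.

```python
import numpy as np

def descent_pair(grad, hess, eps=1e-8):
    """Illustrative helper: from the gradient and Hessian at the current
    iterate, return a Newton-type direction d and, if the Hessian is
    indefinite, a direction of negative curvature s (else None)."""
    w, V = np.linalg.eigh(hess)              # eigenvalues ascending, orthonormal V
    # Newton-type direction from a positive-definite modification of the Hessian
    w_mod = np.maximum(np.abs(w), eps)
    d = -V @ ((V.T @ grad) / w_mod)
    # Negative curvature direction: eigenvector of the most negative eigenvalue
    s = None
    if w[0] < -eps:
        s = V[:, 0].copy()
        if grad @ s > 0:                     # orient s so it is non-ascending
            s = -s
    return d, s

# Example on an indefinite quadratic f(x) = 0.5 x^T H x + g^T x
H = np.array([[2.0, 0.0], [0.0, -1.0]])      # one negative eigenvalue
g = np.array([1.0, 0.0])
d, s = descent_pair(g, H)
# d is a descent direction (g^T d < 0); s satisfies s^T H s < 0
```

Here d handles the locally convex part of the model while s exploits the nonconvexity, matching the two roles the abstract assigns to the directions.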


Cited by 16 publications (13 citation statements)
References 23 publications
“…The rationale behind using a different linesearch technique for each direction is the possibility of capturing possible differences between the two directions. In [20] there is an attempt to match both of the approaches above, in order to yield an efficient algorithm for small scale problems, adopting a monotone stabilization technique.…”
Section: Introduction
confidence: 98%
“…In particular, we observe that the state-of-the-art Newton-type methods are based on the idea of exploiting the local information on the function f (x), obtained by investigating the second order derivatives. In the context of large scale problems, the latter task is pursued by computing at the outer iteration k the pair (d k , s k ) of promising search directions [5,7,8,14,17,20], by means of efficient iterative techniques. Roughly speaking, d k summarizes the local convexity of f (x) at the current iterate, while s k takes into account the local nonconvexity of the objective function.…”
Section: Introduction
confidence: 99%
“…In [9] the alternative use of a negative curvature direction and a Newton-type direction was proposed, within an appropriate linesearch procedure. In the work of Olivares et al. [17] the authors established criteria such that at each iteration either a linesearch procedure using one direction (a Newton-type or a negative curvature direction) or a curvilinear search combining both directions is performed. Curvilinear paths and negative curvature directions are also used in constrained optimization [16,18,19].…”
Section: (1.2)
confidence: 99%
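The citation above mentions a curvilinear search that combines both directions. A minimal backtracking sketch of such a search, along the classical path x(α) = x + α²d + αs with an Armijo-type acceptance test, could look as follows; the function name, constants, and acceptance rule are illustrative assumptions, not the specific procedure of the cited works.

```python
import numpy as np

def curvilinear_search(f, x, fx, g, H, d, s, sigma=1e-4, beta=0.5, max_iter=50):
    """Illustrative curvilinear backtracking search along
    x(alpha) = x + alpha^2 * d + alpha * s, with an Armijo-type
    sufficient-decrease condition built from both directions."""
    slope = g @ d + 0.5 * (s @ (H @ s))      # combined decrease model (< 0)
    alpha = 1.0
    for _ in range(max_iter):
        x_new = x + alpha**2 * d + alpha * s
        if f(x_new) <= fx + sigma * alpha**2 * slope:
            return x_new, alpha              # sufficient decrease achieved
        alpha *= beta                        # backtrack
    return x, 0.0                            # no acceptable step found

# Example: f(x) = x0^2 - x1^2, a quadratic with a saddle at the origin
f = lambda x: x[0]**2 - x[1]**2
x = np.array([1.0, 0.0])
g = np.array([2.0, 0.0])
H = np.diag([2.0, -2.0])
d = np.array([-1.0, 0.0])                    # Newton-type direction
s = np.array([0.0, 1.0])                     # negative curvature direction
x_new, alpha = curvilinear_search(f, x, f(x), g, H, d, s)
```

The quadratic weighting of d and linear weighting of s ensures that, for small steps, the negative curvature direction dominates near saddle-like regions, which is the usual motivation for this path.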
“…Update the parameters; k = k + 1; Until Convergence. There are some works describing the advantages of using a "good" direction of negative curvature (see for example Prieto 2003a, 2003b; Gould et al. 2000 or Olivares et al. 2008). In particular, in Olivares et al. (2008) an adapted procedure for choosing the best combination of directions is proposed. In this work it is shown that in some iterations it is better to combine a Newton direction and a direction of negative curvature than to discard one of them.…”
Section: Nonlinear Optimization Algorithm Using Directions of Negative Curvature
confidence: 99%