1998
DOI: 10.1137/s1052623495295250
Curvilinear Stabilization Techniques for Truncated Newton Methods in Large Scale Unconstrained Optimization

Abstract: The aim of this paper is to define a new class of minimization algorithms for solving large-scale unconstrained problems. In particular, we describe a stabilization framework, based on a curvilinear linesearch, which uses a combination of a Newton-type direction and a negative curvature direction. The motivation for using a negative curvature direction is to take into account the local nonconvexity of the objective function. On the basis of this framework, we propose an algorithm which uses the Lanczos method …
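The abstract's central idea — moving along a curve combining a Newton-type direction s and a negative curvature direction d — can be sketched as a backtracking curvilinear linesearch. This is a minimal illustration, not the paper's algorithm: the eigendecomposition-based directions and the Armijo-like acceptance rule below are simplifying assumptions (the paper uses a truncated-Newton/Lanczos scheme and its own stabilization conditions), and all function names are illustrative.

```python
import numpy as np

def curvilinear_backtracking(f, grad, hess, x, gamma=1e-4, alpha=1.0,
                             shrink=0.5, max_tries=50):
    """One step along the curve x(alpha) = x + alpha^2 * s + alpha * d,
    where s is a Newton-type direction and d a negative-curvature
    direction (zero when the Hessian has no negative eigenvalue).
    Sketch only: dense eigendecomposition, assumed acceptance rule."""
    g, H = grad(x), hess(x)
    w, V = np.linalg.eigh(H)          # eigenvalues in ascending order
    # Newton-type direction from a positive-definite modification of H.
    w_pos = np.maximum(np.abs(w), 1e-8)
    s = -(V @ ((V.T @ g) / w_pos))
    # Negative-curvature direction: eigenvector of the most negative
    # eigenvalue, signed so that it is not an ascent direction.
    if w[0] < 0:
        d = V[:, 0]
        if g @ d > 0:
            d = -d
    else:
        d = np.zeros_like(x)
    # Predicted-decrease surrogate (assumed form, combining both directions).
    decrease = g @ s + 0.5 * min(w[0], 0.0)
    for _ in range(max_tries):
        x_new = x + alpha**2 * s + alpha * d
        if f(x_new) <= f(x) + gamma * alpha**2 * decrease:
            return x_new
        alpha *= shrink
    return x
```

At a saddle point region of a nonconvex function the negative-curvature term lets the iterate escape directions where a pure Newton step would stall.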

Cited by 54 publications (74 citation statements); references 21 publications.
“…In both cases, the computation of the negative curvature direction requires the factorization and the storage of a matrix, which is computationally expensive when the number of variables is large. In [11] the computation of the negative curvature direction was based on a Lanczos procedure. However, the storage of a matrix is required and only a few Lanczos vectors are stored.…”
Section: (1.2)
confidence: 99%
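The citation above highlights computing a negative curvature direction by a Lanczos procedure, i.e. without factorizing the Hessian. A minimal matrix-free sketch of that idea, using only Hessian-vector products: run a few Lanczos steps, check the smallest eigenvalue of the resulting tridiagonal matrix, and map its eigenvector back to a Ritz vector. Names and the breakdown tolerance are illustrative, and unlike the cited approach (which keeps only a few Lanczos vectors) this sketch stores all of them for clarity.

```python
import numpy as np

def lanczos_negative_curvature(hess_vec, n, k=20, seed=0):
    """Approximate a negative-curvature direction of an n x n symmetric
    matrix accessed only through Hessian-vector products hess_vec(v).
    Returns a unit vector d with d @ H d < 0, or None if no negative
    curvature is detected within k Lanczos steps. Sketch only."""
    rng = np.random.default_rng(seed)
    q = rng.standard_normal(n)
    q /= np.linalg.norm(q)
    Q, alphas, betas = [q], [], []
    beta, q_prev = 0.0, np.zeros(n)
    for _ in range(k):
        w = hess_vec(q) - beta * q_prev      # three-term recurrence
        alpha = q @ w
        w -= alpha * q
        alphas.append(alpha)
        beta = np.linalg.norm(w)
        if beta < 1e-12:                     # Krylov space exhausted
            break
        betas.append(beta)
        q_prev, q = q, w / beta
        Q.append(q)
    m = len(alphas)
    # Tridiagonal projection T = Q^T H Q.
    T = (np.diag(alphas)
         + np.diag(betas[:m - 1], 1)
         + np.diag(betas[:m - 1], -1))
    w_eig, v_eig = np.linalg.eigh(T)
    if w_eig[0] >= 0:
        return None                          # no negative curvature found
    d = np.column_stack(Q[:m]) @ v_eig[:, 0] # Ritz vector
    return d / np.linalg.norm(d)
```

Because only products H·v are needed, the Hessian is never stored or factorized, which is what makes the approach attractive for large-scale problems.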
“…In [6,21,31,37,38], this approach was embedded in a nonmonotone framework and used to solve small- and large-scale optimization problems. The scaling of the two directions s_k and d_k is taken into account in [13].…”
Section: Introduction
confidence: 99%
“…Furthermore, as is known, the use of gradient-related directions ensures that any global minimizer has a region of attraction; hence, by starting the local search from a point in this region, we can quickly insert a global minimizer into the set of "promising" points. As a local minimization procedure, we use the Newton-type method proposed in [7], which guarantees convergence to second-order critical points, i.e. stationary points where the Hessian matrix is positive semidefinite.…”
Section: Introduction
confidence: 99%
“…In particular, we propose an algorithm which combines a controlled random search procedure, based on the modified Price algorithm described in [2], with the Newton-type unconstrained minimization algorithm proposed in [7]. More specifically, we exploit the ability of the Price strategy to examine the whole region of interest in order to locate the subregions "most promising" to contain a global minimizer.…”
confidence: 99%