The aim of this paper is to define a new class of minimization algorithms for solving large scale unconstrained problems. In particular, we describe a stabilization framework, based on a curvilinear linesearch, which uses a combination of a Newton-type direction and a negative curvature direction. The motivation for using a negative curvature direction is to take the local nonconvexity of the objective function into account. On the basis of this framework, we propose an algorithm which uses the Lanczos method to determine, at each iteration, both a Newton-type direction and an effective negative curvature direction. The results of extensive numerical testing are reported, together with a comparison with the LANCELOT package. These results show that the algorithm is very competitive, indicating that the proposed approach is promising.
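To make the framework concrete, here is a minimal sketch of one iteration of a curvilinear linesearch of this kind, using the classical curve x(a) = x + a^2 s + a d. All names and tolerances are illustrative assumptions, and the dense eigendecomposition merely stands in for the Lanczos computation the paper actually uses.

```python
import numpy as np

def curvilinear_iteration(f, x, g, H, c1=1e-4, max_backtracks=30):
    """One iteration of a curvilinear linesearch along
    x(a) = x + a**2 * s + a * d, combining a Newton-type direction s
    and a negative curvature direction d.  Illustrative sketch only: a
    large-scale code would obtain both directions from a Lanczos
    process rather than the dense eigendecomposition used here."""
    lam, V = np.linalg.eigh(H)
    # Newton-type direction: solve a positive definite modification of H.
    shift = max(0.0, 1e-8 - lam[0])
    s = -np.linalg.solve(H + shift * np.eye(len(x)), g)
    # Negative curvature direction: eigenvector of the smallest eigenvalue,
    # oriented for descent; zero when H has no negative curvature.
    d = np.zeros_like(x)
    if lam[0] < 0:
        d = -V[:, 0] if g @ V[:, 0] > 0 else V[:, 0]
    # Backtracking Armijo-type acceptance test along the curve.
    fx, a = f(x), 1.0
    model_decrease = g @ s + 0.5 * (d @ H @ d)   # negative by construction
    for _ in range(max_backtracks):
        x_new = x + a**2 * s + a * d
        if f(x_new) <= fx + c1 * a**2 * model_decrease:
            return x_new
        a *= 0.5
    return x
```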
In this paper we propose efficient new linesearch algorithms for solving large scale unconstrained optimization problems which exploit any local nonconvexity of the objective function. Current algorithms in this class typically compute a pair of search directions at every iteration: a Newton-type direction, which ensures both global and fast asymptotic convergence, and a negative curvature direction, which enables the iterates to escape from regions of local nonconvexity. A new point is generated by performing a search along a line or a curve obtained by combining these two directions. However, almost none of these algorithms takes the relative scaling of the directions into account. We propose a new algorithm which accounts for the relative scaling of the two directions. To do this, only the more promising of the two directions is selected at any given iteration, and a linesearch is performed along the chosen direction. The appropriate direction is selected by estimating the rate of decrease of the quadratic model of the objective function along both candidate directions. We prove global convergence to second-order critical points for the new algorithm, and report some preliminary numerical results.
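As a hedged illustration of the selection step (the comparison rule below is an assumption, not the paper's exact test), one can compare the decrease predicted by the quadratic model along each candidate direction and keep the better one; a standard linesearch is then performed along the returned direction.

```python
import numpy as np

def select_search_direction(g, H, s, d):
    """Pick the single more promising direction by comparing the decrease
    predicted by the quadratic model q(p) = g'p + 0.5 * p'Hp along the
    Newton-type direction s and the negative curvature direction d.
    Sketch only: the paper's rule estimates rates of decrease and treats
    the relative scaling more carefully than this direct comparison."""
    q = lambda p: g @ p + 0.5 * (p @ H @ p)
    return s if q(s) <= q(d) else d
```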
In this paper we deal with the iterative computation of negative curvature directions of an objective function within large scale optimization frameworks. In particular, suitable directions of negative curvature of the objective function are an essential tool for guaranteeing convergence to second order critical points. However, an "adequate" negative curvature direction is often required to resemble an eigenvector corresponding to the smallest eigenvalue of the Hessian matrix, so that its computation may be a very difficult task on large scale problems. Several strategies proposed in the literature compute such a direction by relying on matrix factorizations, which may be inefficient or even impracticable in a large scale setting. The iterative methods proposed so far, on the other hand, either need to store a large matrix or need to rerun the recurrence. Guided by these considerations, in this paper we propose the use of an iterative method based on a planar Conjugate Gradient scheme. Under mild assumptions, we provide theory for using this method to compute adequate negative curvature directions within optimization frameworks. Our proposal avoids any matrix storage, along with any additional rerun of the recurrence.
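For orientation, the sketch below approximates such an "adequate" negative curvature direction with a short Lanczos recurrence driven only by Hessian-vector products. It deliberately stores the Lanczos basis, which is exactly the cost (storage or rerun) that the paper's planar Conjugate Gradient scheme is designed to avoid; names and tolerances are assumptions.

```python
import numpy as np

def lanczos_negative_curvature(hess_vec, n, k=30, seed=0):
    """Approximate a negative curvature direction using at most k Lanczos
    steps and only Hessian-vector products hess_vec(v) = H @ v.  Hedged
    sketch: storing the basis Q is the very overhead the paper avoids."""
    rng = np.random.default_rng(seed)
    q = rng.standard_normal(n)
    q /= np.linalg.norm(q)
    Q, alphas, betas = [q], [], []
    beta, q_prev = 0.0, np.zeros(n)
    for _ in range(k):
        v = hess_vec(q) - beta * q_prev
        alpha = q @ v
        v -= alpha * q
        beta = np.linalg.norm(v)
        alphas.append(alpha)
        betas.append(beta)
        if beta < 1e-12:          # invariant subspace found; stop early
            break
        q_prev, q = q, v / beta
        Q.append(q)
    m = len(alphas)
    # Smallest Ritz pair of the tridiagonal projection T = Q' H Q.
    T = np.diag(alphas) + np.diag(betas[:m - 1], 1) + np.diag(betas[:m - 1], -1)
    w, U = np.linalg.eigh(T)
    if w[0] >= 0:
        return None               # no negative curvature detected in the subspace
    return np.column_stack(Q[:m]) @ U[:, 0]
```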
We consider an iterative preconditioning technique for non-convex large scale optimization. First, we consider the solution of large scale indefinite linear systems by means of a Krylov subspace method, and describe the iterative construction of a preconditioner which requires neither matrix products nor matrix storage. The set of directions generated by the Krylov subspace method is used, as a by-product, to provide an approximate inverse preconditioner. We then test our preconditioner within Truncated Newton schemes for large scale unconstrained optimization, where we also generalize the truncation rule of Nash and Sofer (Oper. Res. Lett. 9:219-221, 1990) to the indefinite case. We use a Krylov subspace method both to approximately solve the Newton equation and to construct the preconditioner to be used at the current outer iteration. Extensive numerical experience shows that the proposed preconditioning strategy, compared with the unpreconditioned strategy and with PREQN (Morales and Nocedal in SIAM J. Optim. 10:1079-1096, 2000), may lead to a reduction of the overall number of inner iterations. Finally, we show that our proposal has some similarities with the Limited Memory Preconditioners of Gratton et al. (SIAM J. Optim. 21:912-935, 2011).
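The following is a loose sketch, under stated assumptions, of how an approximate inverse preconditioner can be assembled from the directions that a CG-type Krylov solve generates anyway. The formula used here (conjugacy-based inverse on the stored subspace, identity on the orthogonal complement) is illustrative only; the operator constructed in the paper differs and also covers the indefinite case.

```python
import numpy as np

def make_preconditioner(P, HP):
    """Approximate inverse preconditioner built, at negligible extra cost,
    from conjugate directions P = [p_1 .. p_m] and the products
    HP = [H p_1 .. H p_m] saved during a CG-type inner solve.  Hedged
    sketch: by the H-conjugacy of CG directions, the first term acts like
    H^{-1} on the stored subspace; the identity is used elsewhere."""
    Q, _ = np.linalg.qr(P)                    # orthonormal basis of span(P)
    curv = np.einsum('ij,ij->j', P, HP)       # curvatures p_i' H p_i
    def apply(v):
        on_span = P @ ((P.T @ v) / curv)      # sum_i (p_i'v / p_i'Hp_i) p_i
        off_span = v - Q @ (Q.T @ v)          # identity on the complement
        return on_span + off_span
    return apply
```

Here P and HP are n-by-m arrays whose columns are the stored directions and their Hessian products; the returned apply(v) would then serve as the preconditioner for the next inner Krylov solve.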