The focus in this paper is interior-point methods for bound-constrained nonlinear optimization, where the systems of nonlinear equations that arise are solved with Newton's method. There is a trade-off between solving Newton systems directly, which gives high-quality solutions, and solving many approximate Newton systems, which is computationally less expensive but gives lower-quality solutions. We propose partial and full approximate solutions to the Newton systems. The specific approximate solution depends on estimates of the active and inactive constraints at the solution; at each iteration, these sets are estimated by basic heuristics. The partial approximate solutions are computationally inexpensive, whereas a system of linear equations must be solved for the full approximate solution. The size of this system is determined by the estimate of the inactive constraints at the solution. In addition, we motivate and suggest two Newton-like approaches that are based on an intermediate step consisting of the partial approximate solutions. The theoretical setting is introduced and asymptotic error bounds are given. We also give numerical results to investigate the performance of the approximate solutions within and beyond the theoretical framework.
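As an illustration of the kind of active-set estimate involved, the sketch below flags a lower-bound constraint as active when its slack is small relative to the barrier parameter and its multiplier estimate is not vanishing. The function `estimate_active_set`, the threshold rule, and the constant `c` are hypothetical; the heuristics used in the paper may differ.

```python
import numpy as np

def estimate_active_set(x, l, z, mu, c=1.0):
    """Heuristic estimate of the active and inactive lower bounds.

    A constraint x_i >= l_i is flagged as active when its slack x_i - l_i
    is small relative to the barrier parameter mu and its multiplier
    estimate z_i is not vanishing.  Illustrative rule only; the threshold
    c*sqrt(mu) is an assumption.
    """
    slack = x - l
    active = (slack <= c * np.sqrt(mu)) & (z >= c * np.sqrt(mu))
    return active, ~active

# Example: three variables, two of them close to their lower bound.
x = np.array([1e-6, 0.5, 2e-7])
l = np.zeros(3)
z = np.array([0.8, 1e-7, 0.6])   # multiplier estimates
mu = 1e-4
active, inactive = estimate_active_set(x, l, z, mu)
print(active)    # expected: [ True False  True]
```

In this kind of scheme, the inactive-set estimate returned here would determine the size of the reduced linear system to be solved for the full approximate solution.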
The main focus in this paper is exact linesearch methods for minimizing a quadratic function whose Hessian is positive definite. We give a class of limited-memory quasi-Newton Hessian approximations which generate search directions parallel to those of the BFGS method, or equivalently, to those of the method of preconditioned conjugate gradients. In the setting of reduced Hessians, the class provides a dynamical framework for the construction of limited-memory quasi-Newton methods. These methods attain finite termination on quadratic optimization problems in exact arithmetic. We demonstrate the performance of methods within this framework in finite-precision arithmetic by numerical simulations on sequences of related systems of linear equations that originate from the CUTEst test collection. In addition, we give a compact representation of the Hessian approximations in the full Broyden class for the general unconstrained optimization problem. This representation consists of explicit matrices, with gradients as the only vector components.
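The equivalence underlying the class can be illustrated on a small quadratic: BFGS with exact linesearch, started from the identity, generates search directions parallel to those of conjugate gradients and hence terminates in at most n steps in exact arithmetic. The sketch below is a minimal dense-matrix demonstration; the function name and problem data are illustrative and do not reproduce the limited-memory representation or the CUTEst experiments.

```python
import numpy as np

def bfgs_exact_linesearch(A, b, x0, n_iter):
    """BFGS with exact linesearch on f(x) = 0.5 x'Ax - b'x.

    On a strictly convex quadratic this recursion generates search
    directions parallel to those of conjugate gradients and terminates
    (in exact arithmetic) in at most n steps.  Illustrative sketch only.
    """
    n = len(b)
    H = np.eye(n)                       # inverse Hessian approximation
    x = x0.copy()
    g = A @ x - b
    for _ in range(n_iter):
        p = -H @ g                      # quasi-Newton direction
        alpha = -(g @ p) / (p @ A @ p)  # exact step for a quadratic
        s = alpha * p
        x_new = x + s
        g_new = A @ x_new - b
        y = g_new - g
        rho = 1.0 / (y @ s)
        V = np.eye(n) - rho * np.outer(s, y)
        H = V @ H @ V.T + rho * np.outer(s, s)   # BFGS inverse update
        x, g = x_new, g_new
        if np.linalg.norm(g) < 1e-12:
            break
    return x

rng = np.random.default_rng(0)
M = rng.standard_normal((6, 6))
A = M @ M.T + 6 * np.eye(6)             # SPD Hessian
b = rng.standard_normal(6)
x = bfgs_exact_linesearch(A, b, np.zeros(6), 6)
print(np.allclose(x, np.linalg.solve(A, b)))    # finite termination: True
```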
The focus in this work is on interior-point methods for inequality-constrained quadratic programs, and particularly on the system of nonlinear equations to be solved for each value of the barrier parameter. Newton iterations give high-quality solutions, but we are interested in modified Newton systems that are computationally less expensive at the expense of lower-quality solutions. We propose a structured modified Newton approach in which each modified Jacobian is composed of a previous Jacobian plus one low-rank update matrix per succeeding iteration. Each update matrix is, for a given rank, chosen to minimize the distance to the Jacobian at the current iterate in both the 2-norm and the Frobenius norm. The approach is structured in the sense that it preserves the nonzero pattern of the Jacobian. The choice of update matrix is supported by results in an ideal theoretical setting. We also present numerical results with a basic interior-point implementation to investigate the practical performance within and beyond the theoretical framework. To improve performance beyond the theoretical framework, we motivate and construct two heuristics that are added to the method.
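To illustrate the choice of update matrix, the sketch below forms a rank-r correction as the best rank-r approximation, in both the 2-norm and the Frobenius norm (Eckart-Young), of the discrepancy between the previous Jacobian approximation and the current Jacobian. The function name and data are hypothetical, and the sketch does not enforce the structural constraint of preserving the Jacobian's nonzero pattern.

```python
import numpy as np

def low_rank_update(B_prev, J_curr, r):
    """Rank-r update U minimizing ||(B_prev + U) - J_curr|| in both the
    2-norm and the Frobenius norm, via the truncated SVD of the
    discrepancy (Eckart-Young).  Illustrative only: the structured
    choice in the paper additionally preserves the nonzero pattern.
    """
    E = J_curr - B_prev                         # discrepancy to be reduced
    U_svd, s, Vt = np.linalg.svd(E)
    U = (U_svd[:, :r] * s[:r]) @ Vt[:r, :]      # best rank-r approximation of E
    return B_prev + U

# Example: a rank-1 correction of a previous Jacobian.
rng = np.random.default_rng(1)
B_prev = rng.standard_normal((5, 5))
J_curr = B_prev + np.outer(rng.standard_normal(5), rng.standard_normal(5))
B_new = low_rank_update(B_prev, J_curr, r=1)
print(np.allclose(B_new, J_curr))               # rank-1 gap fully recovered: True
```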