The solution of KKT systems is ubiquitous in optimization methods and often dominates the computation time, especially when large-scale problems are considered. Thus, the efficient implementation of such methods depends heavily on the availability of effective linear algebra algorithms and software that, in turn, are able to take into account the specific needs of optimization. In this paper we discuss the mutual impact of linear algebra and optimization, focusing on interior point methods and on the iterative solution of the KKT system. Three critical issues are addressed: preconditioning, termination control for the inner iterations, and inertia control.
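For reference (the notation here is generic and not taken from the paper), the KKT system solved at each interior point iteration typically has the saddle point form
\[
\begin{bmatrix} H + D & A^{T} \\ A & 0 \end{bmatrix}
\begin{bmatrix} \Delta x \\ \Delta y \end{bmatrix}
=
\begin{bmatrix} r_{1} \\ r_{2} \end{bmatrix},
\]
where H is the Hessian of the objective or of the Lagrangian, A is the constraint Jacobian, D is a positive diagonal scaling arising from the barrier terms and changing at every iteration, and (Δx, Δy) collects the primal and dual search directions. The three issues listed above all refer to this system: how to precondition it, how accurately to solve it at each outer iteration, and how to detect and correct its inertia when it differs from the one required by the optimization method.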
Iterative solvers appear to be very promising in the development of efficient software, based on Interior Point methods, for large-scale nonlinear optimization problems. In this paper we focus on the use of preconditioned iterative techniques to solve the KKT system arising at each iteration of a Potential Reduction method for convex Quadratic Programming. We consider the augmented system approach and analyze the behaviour of the Constraint Preconditioner with the Conjugate Gradient algorithm. Comparisons with a direct solution of the augmented system and with MOSEK show the effectiveness of the iterative approach on large-scale sparse problems.
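As an illustration of the ingredients just described, the SciPy sketch below assembles a toy augmented system, builds a constraint preconditioner in which the (1,1) block Q + D is replaced by its diagonal while the constraint blocks are kept exactly, and applies it within a Krylov solver. The data, the sizes, and the diagonal approximation of the (1,1) block are illustrative assumptions, not taken from the paper, and GMRES merely stands in for the constraint-preconditioned Conjugate Gradient method analyzed by the authors.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import splu, LinearOperator, gmres

rng = np.random.default_rng(0)
n, m = 400, 100                      # toy sizes: n variables, m equality constraints

# Toy convex QP data: Q symmetric positive definite, A with full row rank,
# D a positive diagonal barrier/scaling term that changes at every IPM iteration.
Q0 = sp.random(n, n, density=0.01, random_state=0)
Q = (Q0 @ Q0.T + sp.identity(n)).tocsr()
A = (sp.random(m, n, density=0.05, random_state=1)
     + sp.hstack([sp.identity(m), sp.csr_matrix((m, n - m))])).tocsr()
D = sp.diags(rng.uniform(0.1, 10.0, n))

# Augmented (KKT) system of one interior point iteration.
K = sp.bmat([[Q + D, A.T], [A, None]], format='csc')
rhs = rng.standard_normal(n + m)

# Constraint preconditioner: approximate Q + D by its diagonal, keep the
# constraint blocks exactly, and factorize the resulting saddle point matrix once.
G = sp.diags((Q + D).diagonal())
P = sp.bmat([[G, A.T], [A, None]], format='csc')
P_lu = splu(P)
M = LinearOperator(K.shape, matvec=P_lu.solve)

# The paper combines this preconditioner with the Conjugate Gradient method;
# GMRES is used here only to keep the sketch robust with off-the-shelf SciPy routines.
x, info = gmres(K, rhs, M=M, restart=50)
print('converged' if info == 0 else f'gmres info = {info}')
```

Keeping the constraint blocks exact is what characterizes a constraint preconditioner: the preconditioned matrix has the eigenvalue 1 with multiplicity at least 2m, and the remaining eigenvalues are governed by how well the (1,1) block is approximated on the null space of A.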
We present a technique for building effective and low cost preconditioners for sequences of shifted linear systems (A + αI)x_α = b, where A is symmetric positive definite and α > 0. This technique updates a preconditioner for A, available in the form of an LDL^T factorization, by modifying only the nonzero entries of the L factor in such a way that the resulting preconditioner mimics the diagonal of the shifted matrix and reproduces its overall behavior. This approach is supported by a theoretical analysis as well as by numerical experiments, showing that it works efficiently for a broad range of values of α.
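The following sketch conveys the basic idea of reusing one LDL^T factorization of A across many shifts, using the simplest possible update, in which only D is replaced by D + αI while L is kept unchanged; the preconditioner proposed in the paper is more refined, since it also modifies the nonzero entries of L to match the diagonal of A + αI. The test matrix, sizes, and function names are illustrative.

```python
import numpy as np
from scipy.linalg import cholesky, solve_triangular
from scipy.sparse.linalg import LinearOperator, cg

rng = np.random.default_rng(0)
n = 500
B = rng.standard_normal((n, n))
A = B @ B.T / n + np.eye(n)          # symmetric positive definite test matrix
b = rng.standard_normal(n)

# Factor A once and rewrite the Cholesky factor as A = L diag(d) L^T,
# with L unit lower triangular and d > 0.
C = cholesky(A, lower=True)
d = np.diag(C) ** 2
L = C / np.diag(C)

def shifted_preconditioner(alpha):
    """Apply (L (D + alpha*I) L^T)^{-1}, i.e., reuse the factors of A for A + alpha*I.
    This is the naive update only; the paper's preconditioner also modifies the
    nonzero entries of L so that the diagonal of the shifted matrix is matched."""
    d_alpha = d + alpha
    def solve(r):
        y = solve_triangular(L, r, lower=True) / d_alpha
        return solve_triangular(L.T, y, lower=False)
    return LinearOperator((n, n), matvec=solve)

for alpha in (1e-2, 1e0, 1e2):
    counter = []
    x, info = cg(A + alpha * np.eye(n), b,
                 M=shifted_preconditioner(alpha),
                 callback=lambda xk: counter.append(1))
    print(f'alpha = {alpha:g}: info = {info}, {len(counter)} PCG iterations')
```

The appeal of updates of this kind is that handling a new value of α costs only a vector update of the diagonal factor, while a fresh factorization of every shifted matrix is avoided.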
We propose a framework for building preconditioners for sequences of linear systems of the form (A + ∆_k)x_k = b_k, where A is symmetric positive semidefinite and ∆_k is diagonal positive semidefinite. Such sequences arise in several optimization methods, e.g., in affine-scaling methods for bound-constrained convex quadratic programming and bound-constrained linear least squares, as well as in trust-region and overestimation methods for convex unconstrained optimization problems and nonlinear least squares. For all the matrices of a sequence, the preconditioners are obtained by updating any preconditioner for A available in the LDL^T form. The preconditioners in the framework satisfy the natural requirement of being effective on slowly varying sequences; furthermore, under an additional property they are also able to cluster eigenvalues of the preconditioned matrix when some entries of ∆_k are sufficiently large. We present two low-cost preconditioners sharing the abovementioned properties and evaluate them on sequences of linear systems generated by the reflective Newton method applied to bound-constrained convex quadratic programming problems, and on sequences arising in solving nonlinear least-squares problems with the Regularized Euclidean Residual method. The results of the numerical experiments show the effectiveness of these preconditioners.
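In the same spirit as the previous sketch, the toy example below reuses a single LDL^T-type factorization of A over a sequence of diagonal modifications ∆_k, simply replacing D by D + ∆_k at each step. This is only the most naive member of such a framework, not one of the two preconditioners proposed in the paper, and the slowly varying sequence of ∆_k is synthetic rather than generated by an optimization method.

```python
import numpy as np
from scipy.linalg import cholesky, solve_triangular
from scipy.sparse.linalg import LinearOperator, cg

rng = np.random.default_rng(1)
n = 400
B = rng.standard_normal((n, n))
A = B @ B.T / n                       # symmetric positive semidefinite "base" matrix
b = rng.standard_normal(n)

# One factorization of (a slightly regularized) A in LDL^T form: A ≈ L diag(d) L^T.
C = cholesky(A + 1e-8 * np.eye(n), lower=True)
d = np.diag(C) ** 2
L = C / np.diag(C)

def diag_updated_prec(delta):
    """Keep L, replace D by D + Delta_k: the cheapest update in this spirit.
    The paper proposes two more refined low-cost preconditioners."""
    dk = d + delta
    def solve(r):
        y = solve_triangular(L, r, lower=True) / dk
        return solve_triangular(L.T, y, lower=False)
    return LinearOperator((n, n), matvec=solve)

# A slowly varying sequence of diagonal modifications Delta_k, mimicking the
# behaviour of, e.g., an affine-scaling or regularization method.
delta = rng.uniform(0.0, 1.0, n)
for k in range(5):
    delta = delta * rng.uniform(0.8, 1.25, n)   # mild change from one system to the next
    counter = []
    x, info = cg(A + np.diag(delta), b,
                 M=diag_updated_prec(delta),
                 callback=lambda xk: counter.append(1))
    print(f'system {k}: info = {info}, {len(counter)} PCG iterations')
```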