Sequential optimality conditions provide adequate theoretical tools to justify stopping criteria for nonlinear programming solvers. In this work we analyze the Approximate KKT and Approximate Gradient Projection conditions. These conditions are not necessarily equivalent: we establish the implications that hold between them and give counterexamples for those that fail. Algorithmic consequences are also discussed.
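As a concrete illustration, the Approximate KKT (AKKT) condition can be stated in its standard form from the sequential-optimality literature (this formulation is supplied here for context; it is not quoted from the abstract above). For the problem of minimizing $f(x)$ subject to $g(x) \le 0$ and $h(x) = 0$, a feasible point $x^*$ satisfies AKKT if:

```latex
\exists\; x^k \to x^*,\quad \lambda^k \ge 0,\quad \mu^k \in \mathbb{R}^m
\quad \text{such that}
\]
\[
\nabla f(x^k) + \sum_{i} \lambda_i^k \nabla g_i(x^k)
            + \sum_{j} \mu_j^k \nabla h_j(x^k) \;\to\; 0,
\qquad
\min\{-g_i(x^k),\, \lambda_i^k\} \;\to\; 0 \;\; \forall i .
```

The key point is that AKKT is a condition on a sequence of (possibly infeasible) iterates rather than on the limit point alone, which is why it directly justifies the stopping tests implemented in practical solvers.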
We present two new constraint qualifications (CQs) that are weaker than the recently introduced Relaxed Constant Positive Linear Dependence (RCPLD) constraint qualification. RCPLD is based on the assumption that certain subsets of the gradients of the active constraints preserve positive linear dependence locally. A major open question was to identify the exact set of gradients whose properties must be preserved locally while still yielding a CQ. We answer this question with the first new constraint qualification, which we call the Constant Rank of the Subspace Component (CRSC) CQ. This new CQ also preserves many of the good properties of RCPLD, such as local stability and the validity of an error bound. We also introduce an even weaker CQ, called Constant Positive Generator (CPG), which can replace RCPLD in the global convergence analysis of algorithms. We close this work by extending convergence results for algorithms belonging to all the main classes of nonlinear optimization methods: SQP, augmented Lagrangians, interior point algorithms, and inexact restoration.

* This work was supported by PRONEX-Optimization (PRONEX-CNPq/FAPERJ E-26/171.510/2006-APQ1), Fapesp (Grants
In this paper we consider the minimization of a continuous function that is potentially not differentiable or not twice differentiable on the boundary of the feasible region. By exploiting an interior point technique, we present first- and second-order optimality conditions for this problem that reduce to the classical ones when the derivative on the boundary is available. For this type of problem, existing necessary conditions often rely on the notion of subdifferential or become non-trivially weaker than the KKT condition in the (twice-)differentiable counterpart problems. In contrast, this paper presents a new set of first- and second-order necessary conditions that are derived without the use of subdifferentials and reduce to exactly the KKT condition when (twice-)differentiability holds. As a result, these conditions are stronger than some existing conditions considered for the discussed minimization problem when only non-negativity constraints are present. To solve for these optimality conditions in the special but important case of linearly constrained problems, we present two novel interior trust-region point algorithms and show that their worst-case computational efficiency in achieving the potentially stronger optimality conditions matches the best-known complexity bounds. Since this work considers a more general problem than the literature, our results also indicate that the best-known complexity bounds hold for a wider class of nonlinear programming problems.
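For reference, the classical KKT condition that the new conditions reduce to under differentiability can be written out for the simplest case mentioned in the abstract, a problem with only non-negativity constraints (this is the textbook statement, given here for context rather than taken from the paper itself). For $\min_{x \ge 0} f(x)$ with $f$ differentiable at $x^*$, the KKT condition is the componentwise complementarity system:

```latex
x^* \ge 0,
\qquad
\nabla f(x^*) \ge 0,
\qquad
x_i^* \,\frac{\partial f}{\partial x_i}(x^*) = 0 \quad \forall i .
```

Equivalently, the partial derivative must vanish along every coordinate with $x_i^* > 0$ and be non-negative along coordinates at the boundary $x_i^* = 0$; the difficulty addressed in the abstract is that $\nabla f(x^*)$ may not exist on that boundary.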