2012
DOI: 10.1137/10081085x
Local Convergence of Exact and Inexact Augmented Lagrangian Methods under the Second-Order Sufficient Optimality Condition

Abstract. We establish local convergence and rate of convergence of the classical augmented Lagrangian algorithm under the sole assumption that the dual starting point is close to a multiplier satisfying the second-order sufficient optimality condition. In particular, no constraint qualifications of any kind are needed. Previous literature on the subject required, in addition, the linear independence constraint qualification and either the strict complementarity assumption or a stronger version of the second-order sufficient optimality condition.
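For orientation, the method analyzed is the classical augmented Lagrangian (method of multipliers) iteration: minimize the augmented Lagrangian in the primal variables with the multiplier estimate held fixed, then update the multiplier. The Python sketch below illustrates this structure for an equality-constrained problem min f(x) s.t. h(x) = 0; the inner BFGS solver, the fixed penalty value c, the tolerance, and the toy problem are illustrative assumptions and are not taken from the paper.

import numpy as np
from scipy.optimize import minimize


def augmented_lagrangian(f, h, x0, lam0, c=10.0, outer_iters=20, tol=1e-8):
    """Classical augmented Lagrangian (method of multipliers) iteration, as a sketch."""
    x = np.asarray(x0, dtype=float)
    lam = np.asarray(lam0, dtype=float)
    for _ in range(outer_iters):
        def aug_lagrangian(z):
            # L_c(z, lam) = f(z) + lam^T h(z) + (c/2) ||h(z)||^2
            hz = h(z)
            return f(z) + lam @ hz + 0.5 * c * (hz @ hz)

        # Primal step: (approximately) minimize the augmented Lagrangian in x.
        x = minimize(aug_lagrangian, x, method="BFGS").x
        # Dual step: first-order (Hestenes-Powell) multiplier update.
        lam = lam + c * h(x)
        if np.linalg.norm(h(x)) < tol:
            break
    return x, lam


# Toy example: minimize x1^2 + x2^2 subject to x1 + x2 = 1
# (solution x = (0.5, 0.5) with multiplier lam = -1).
f = lambda x: x[0] ** 2 + x[1] ** 2
h = lambda x: np.array([x[0] + x[1] - 1.0])
x_star, lam_star = augmented_lagrangian(f, h, x0=[0.0, 0.0], lam0=[0.0])

Roughly, the paper shows that if the dual starting point is close to a multiplier satisfying the second-order sufficient optimality condition, this primal-dual iteration converges locally at a linear rate (faster for larger penalty values), without any constraint qualification.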

Citation report: cited by 75 publications (76 citation statements); references 27 publications. Of the citation statements, 4 are classified as supporting, 72 as mentioning, and 0 as contrasting; citing publications span 2012 to 2021. Selected citation statements (ordered by relevance):
“…It should also be mentioned that IPOPT-C [49] apparently is not supported by any global convergence theory. ALGENCAN definitely outperforms its competitors in terms of major iteration count, which is the sign of a higher convergence rate, in agreement with the local rate of convergence results in [20] that allow any kind of degeneracy. On the other hand, the cost of (especially late) iterations of ALGENCAN is rather high, and so its convergence rate does not translate into saved CPU time.…”
Section: Introduction (supporting)
confidence: 77%
“…In particular, Assumptions 3.1-3.3 do not preclude the possibility that problem (NP) is infeasible. Recent work indicates that the iterates of the stabilized SQP subproblem exhibit superlinear convergence under mild conditions (see, e.g., [13,14,35,36]). In particular, neither strict complementarity nor a constraint qualification are required.…”
Section: It Follows From (224) That (mentioning)
confidence: 99%
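For reference, one standard way to write the stabilized SQP subproblem mentioned above is the following quadratic program (shown here for the equality-constrained case min f(x) s.t. h(x) = 0 with Lagrangian L(x, λ) = f(x) + ⟨λ, h(x)⟩; this is a generic sketch, not the specific formulation of [13,14,35,36]):

\[
\min_{(\xi,\eta)} \;\; \langle f'(x_k), \xi \rangle
+ \frac{1}{2} \left\langle \frac{\partial^2 L}{\partial x^2}(x_k,\lambda_k)\,\xi, \xi \right\rangle
+ \frac{\sigma_k}{2} \lVert \eta \rVert^2
\qquad \text{s.t.} \qquad
h(x_k) + h'(x_k)\xi - \sigma_k(\eta - \lambda_k) = 0,
\]

where \(\sigma_k > 0\) is the stabilization parameter, typically taken proportional to the natural residual of the KKT system at \((x_k,\lambda_k)\), and the next iterate is \((x_{k+1}, \lambda_{k+1}) = (x_k + \xi_k, \eta_k)\). The stabilizing term \(\sigma_k(\eta - \lambda_k)\) keeps the subproblem solvable and well conditioned even when \(h'(x_k)\) is rank deficient, which is why superlinear convergence can be obtained without strict complementarity or a constraint qualification.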
“…For example, SOSC (12) was the only assumption needed to prove local convergence of the stabilized sequential quadratic programming method in [5] and of the augmented Lagrangian algorithm in [6], with the error bound (8) playing a key role. When there are equality constraints only, the error bound itself (equivalently, noncriticality of the multiplier) is enough in the case of stabilized sequential quadratic programming [17].…”
Section: Property 1 (Upper Lipschitz Stability Of the Solutions Of KKT…) (mentioning)
confidence: 99%
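For context, the error bound invoked above (the numbers (12) and (8) refer to equations of the citing paper) has the following standard form in the equality-constrained case, stated here as a sketch with \(\mathcal{M}(\bar x)\) denoting the set of Lagrange multipliers associated with the stationary point \(\bar x\):

\[
\lVert x - \bar x \rVert + \operatorname{dist}\bigl(\lambda, \mathcal{M}(\bar x)\bigr)
\;\le\; C \left\lVert
\begin{pmatrix} \frac{\partial L}{\partial x}(x,\lambda) \\ h(x) \end{pmatrix}
\right\rVert
\]

for all \((x, \lambda)\) close enough to \((\bar x, \bar\lambda)\) and some constant \(C > 0\), the right-hand side being the natural residual of the KKT system. The second-order sufficient optimality condition guarantees this bound without any constraint qualification, and in this equality-constrained setting the bound is equivalent to noncriticality of the multiplier \(\bar\lambda\), as the statement above notes; it is this stability property that drives the local convergence analyses in [5], [6], and [17].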