A projected Lagrangian algorithm and its implementation for sparse nonlinear constraints (1982)
DOI: 10.1007/bfb0120949

Cited by 277 publications (99 citation statements); references 25 publications.
“…Linearly constrained Lagrangian (LCL) methods are traditionally stated for MP problems with equality constraints and bound constraints (inequality constraints are reformulated introducing slacks); see [14,24,27]. We therefore consider MP problem (1.3) with bound constraints given by G(x) = −x, x ∈ R n .…”
Section: Linearly Constrained Lagrangian Methods
confidence: 99%
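The slack reformulation mentioned in this statement can be written out explicitly. This is the standard transformation (not specific to [14,24,27]) that puts an inequality-constrained problem into the equality-plus-bounds form in which LCL methods are traditionally stated:

```latex
% Inequality-constrained problem
\min_{x} \; f(x) \quad \text{s.t.} \quad c(x) \le 0
% After introducing slack variables s \ge 0:
\min_{x,\,s} \; f(x) \quad \text{s.t.} \quad c(x) + s = 0, \;\; s \ge 0
% i.e. equality constraints plus bound constraints only.
```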
“…In what follows, we shall extend the framework for dealing with Newton-related algorithms that can be regarded as inexact JNM (iJNM). These will include the stabilized version of SQP [12,13,16,33–35], sequential quadratically constrained quadratic programming [2,11,15,31], and linearly constrained Lagrangian methods [14,24,27]. Formally, instead of (1.2), the next iterate z^{k+1} would now satisfy the (perturbed) GE…”
Section: z(r) − z̄ = O(‖r‖)
confidence: 99%
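The perturbed generalized equation referred to here is cut off in the snippet. A standard statement of the inexact Josephy–Newton iteration for a GE (our reconstruction of the usual form, which may differ in notation from the cited paper) is:

```latex
% Exact Josephy--Newton step for the generalized equation  0 \in \Phi(z) + N(z):
0 \in \Phi(z^k) + \Phi'(z^k)(z^{k+1} - z^k) + N(z^{k+1})
% Inexact (iJNM) version: the inclusion holds only up to a perturbation \omega^k,
\omega^k \in \Phi(z^k) + \Phi'(z^k)(z^{k+1} - z^k) + N(z^{k+1})
```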
“…There are two principal difficulties with such an approach: first, inclusion of the governing equations forces the analysis software to be embedded within the optimizer; and second, the PDE constraints induce sparse constraint Jacobian and Lagrangian Hessian matrices [24], thus necessitating sparse optimization strategies for large problems. Addressing the second problem with a general-purpose sparse optimizer, such as MINOS [22], is problematic: the favorable structure of the constraint Jacobian with respect to state variables (i.e. the tangent stiffness matrix, which in the subsonic flow case is symmetric positive definite with nonzeros corresponding to edges of a planar graph) cannot be exploited.…”
Section: An Infeasible Path Method for Optimization of Systems Govern…
confidence: 99%
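The structural point in this statement can be illustrated with a toy example (hypothetical data, not from the cited work): when the state-variable block of the constraint Jacobian is symmetric positive definite, like a tangent stiffness matrix, a Cholesky factorization can be used for solves, which a general-purpose sparse optimizer treating the Jacobian as unstructured cannot exploit.

```python
import numpy as np

# Hypothetical state block J_u of a constraint Jacobian [J_u | J_d]:
# symmetric positive definite, with nonzeros corresponding to the
# edges of a small (path) graph, standing in for a tangent stiffness matrix.
J_u = np.array([[4.0, 1.0, 0.0],
                [1.0, 3.0, 1.0],
                [0.0, 1.0, 2.0]])
r = np.array([1.0, 2.0, 3.0])

# Exploit SPD structure: factor J_u = L L^T once, then solve by
# two triangular solves instead of a general sparse LU factorization.
L = np.linalg.cholesky(J_u)
y = np.linalg.solve(L.T, np.linalg.solve(L, r))
assert np.allclose(J_u @ y, r)
```

The same idea scales to the sparse case, where a sparse Cholesky factorization of the state block is typically far cheaper than factoring the whole Jacobian as an unstructured matrix.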
“…The columns labeled m, n, and nz give the number of constraints, variables, and constraint nonzeros in the problem seen by the solver; in this particular example, the number of variables does not change, but it does in other examples. The time column shows execution times for MINOS 5.5, a nonlinear solver by Murtagh and Saunders (1982) that uses an activeset strategy and solves linear problems (such as the git problems) as a special case. MINOS benefits from AMPL's presolve phase, as it does not have its own presolver, and it is affected by the var_bounds setting.…”
Section: Presolve
confidence: 99%
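The reductions described in this statement (fewer constraints m, variables n, and nonzeros nz after presolve) can be sketched with one simple presolve rule. This is a minimal illustration of the idea, not AMPL's actual presolver: variables whose lower and upper bounds coincide are fixed and substituted out, shrinking the problem before the solver sees it.

```python
def presolve_fixed_vars(rows, rhs, lb, ub):
    """Substitute out fixed variables (lb[v] == ub[v]) from constraints
    sum_v rows[i][v] * x_v = rhs[i].  Each row is a dict var -> coefficient.
    Returns the reduced rows, adjusted right-hand sides, and the
    remaining free variables."""
    fixed = {v: lb[v] for v in lb if lb[v] == ub[v]}
    new_rows, new_rhs = [], []
    for row, b in zip(rows, rhs):
        # Move each fixed variable's contribution to the right-hand side.
        shift = sum(c * fixed[v] for v, c in row.items() if v in fixed)
        new_rows.append({v: c for v, c in row.items() if v not in fixed})
        new_rhs.append(b - shift)
    free = [v for v in lb if v not in fixed]
    return new_rows, new_rhs, free

# Example: x is fixed at 1 and z at 2, so both constraints reduce to
# equations in y alone -- n drops from 3 to 1 and nz from 4 to 2.
rows, rhs, free = presolve_fixed_vars(
    [{"x": 1, "y": 2}, {"y": 1, "z": 3}],
    [5, 4],
    {"x": 1, "y": 0, "z": 2},
    {"x": 1, "y": 10, "z": 2},
)
```

Real presolvers (including AMPL's) chain many such rules, which is why the m, n, and nz columns in the table shrink and why a solver without its own presolver, like MINOS, benefits.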