2010
DOI: 10.1137/090763317

A Moving Balls Approximation Method for a Class of Smooth Constrained Minimization Problems

Year Published: 2012–2024

Cited by 26 publications (60 citation statements)
References 4 publications
“…Hence, using the surrogate model (7) is similar to the approach used in Auslender et al. (2010), and the feasibility proofs follow the same line of argument. Note, however, that in Auslender et al. (2010), gradient Lipschitz constants are used instead of the Hessian upper bound matrices Q.…”
Section: RTO With QCQP Surrogate Model (mentioning)
confidence: 97%
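
For context, a minimal sketch of the two quadratic constraint majorizations being contrasted (notation ours, not quoted from either paper; L_i denotes a Lipschitz constant of \nabla g_i, and Q_i a uniform upper bound on the Hessian \nabla^2 g_i):

    g_i(x) \le g_i(x_k) + \langle \nabla g_i(x_k), x - x_k \rangle + \tfrac{L_i}{2}\|x - x_k\|^2    (gradient Lipschitz constant, as in Auslender et al., 2010)

    g_i(x) \le g_i(x_k) + \langle \nabla g_i(x_k), x - x_k \rangle + \tfrac{1}{2}(x - x_k)^\top Q_i (x - x_k)    (Hessian upper bound matrix Q)

In either case, any point feasible for the quadratic surrogate is feasible for the original constraint g_i(x) \le 0, which is why the feasibility proofs can follow the same line of argument.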
“…(Nominal plant feasibility). The scheme (7) can be understood as an RTO-specific translation of the nominal optimization method suggested by Auslender et al. (2010). Therein, it is shown that the QCQP formulation leads to recursively feasible iterates provided that the gradient information is exact and the following assumptions hold:…”
Section: RTO With QCQP Surrogate Model (mentioning)
confidence: 99%
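
As a hedged rendering of the per-iteration subproblem these excerpts describe (our notation; the assumptions elided above are not reproduced here), MBA solves at each iterate x_k the QCQP

    x_{k+1} = \operatorname{argmin}_x \{ \langle \nabla f(x_k), x - x_k \rangle + \tfrac{L_0}{2}\|x - x_k\|^2 : g_i(x_k) + \langle \nabla g_i(x_k), x - x_k \rangle + \tfrac{L_i}{2}\|x - x_k\|^2 \le 0, \ i = 1, \dots, m \},

where L_0 and L_i are Lipschitz constants of \nabla f and \nabla g_i. Each quadratic constraint carves out a Euclidean ball contained in \{x : g_i(x) \le 0\}, so every subproblem solution remains feasible for the original program, which matches the recursive feasibility noted above.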
“…We seek an algorithm that is, on one hand, free of unknown parameters, as opposed to the smoothing methods alluded to above, and, on the other hand, capable of improving over subgradient-like schemes without requiring any favorable geometry, yet producing an O(1/ε) efficiency estimate. We achieve these goals by combining duality with an idea we recently introduced in [2] through the Moving Ball Algorithm (MBA), which we developed for solving smooth nonlinear programs, whereby the smooth objective is linearized and regularized and the feasible set is approximated by a sequence of balls. In essence, the MBA derived in [2] can be viewed as a natural extension of the gradient projection method for handling a smooth objective with smooth inequality constraints.…”
Section: 4 For Examples (mentioning)
confidence: 99%
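
To make the "natural extension of gradient projection" reading concrete, here is a minimal Python sketch of one MBA-style step for a single smooth constraint g(x) <= 0; the function names and the single-constraint restriction are our assumptions (the method in [2] handles m constraints through a QCQP subproblem):

    import numpy as np

    def mba_step(x, grad_f, g, grad_g, L0, L1):
        # One Moving-Balls-style step for min f(x) s.t. g(x) <= 0.
        # L0, L1: assumed Lipschitz constants of grad f and grad g.
        gx, dg = g(x), grad_g(x)
        # Completing the square in g(x) + <dg, y - x> + (L1/2)||y - x||^2 <= 0
        # shows the surrogate feasible set is a ball B(c, r):
        c = x - dg / L1
        r2 = dg @ dg / L1**2 - 2.0 * gx / L1
        if r2 < 0.0:
            raise ValueError("empty surrogate ball; is x feasible and L1 valid?")
        r = np.sqrt(r2)
        # Minimizing <grad_f(x), y - x> + (L0/2)||y - x||^2 over B(c, r)
        # amounts to a gradient step followed by projection onto the ball:
        y = x - grad_f(x) / L0
        d = y - c
        dist = np.linalg.norm(d)
        return y if dist <= r else c + (r / dist) * d

Starting from a feasible x, the returned point stays feasible (the ball sits inside {g <= 0}) and does not increase f, which is exactly the gradient-projection flavor the excerpt points to.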
“…We achieve these goals by combining duality with an idea we recently introduced in [2] through the Moving Ball Algorithm (MBA), which we developed for solving smooth nonlinear programs, whereby the smooth objective is linearized and regularized and the feasible set is approximated by a sequence of balls. In essence, the MBA derived in [2] can be viewed as a natural extension of the gradient projection method for handling a smooth objective with smooth inequality constraints. Here, our objective function being nonsmooth, we keep it intact, and we adopt the ball approximation for the smooth constraint.…”
Section: 4 For Examples (mentioning)
confidence: 99%
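
A short sketch of the ball approximation mentioned here (our algebra, via completing the square; L is a Lipschitz constant of \nabla g): the quadratic surrogate set

    \{ x : g(x_k) + \langle \nabla g(x_k), x - x_k \rangle + \tfrac{L}{2}\|x - x_k\|^2 \le 0 \} = \{ x : \|x - c_k\| \le r_k \},

with center c_k = x_k - \tfrac{1}{L}\nabla g(x_k) and radius r_k = \sqrt{\|\nabla g(x_k)\|^2 / L^2 - 2 g(x_k) / L}. On this reading, the scheme the excerpt describes would keep the nonsmooth objective F intact and solve x_{k+1} \in \operatorname{argmin} \{ F(x) : \|x - c_k\| \le r_k \}; this is our interpretation of the excerpt, not a formula quoted from the paper.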