2013
DOI: 10.1137/110856253

Inexact Restoration Method for Derivative-Free Optimization with Smooth Constraints

Abstract: A new method is introduced for solving constrained optimization problems in which the derivatives of the constraints are available but the derivatives of the objective function are not. The method is based on the Inexact Restoration framework, by means of which each iteration is divided into two phases. In the first phase, one considers only the constraints, in order to improve feasibility. In the second phase, one minimizes a suitable objective function subject to a linear approximation of the constraints. The se…
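The two-phase iteration described in the abstract can be conveyed with a short Python sketch. This is a minimal illustration under stated assumptions, not the paper's implementation: it assumes smooth equality constraints h(x) = 0 with an available Jacobian jac_h, a black-box objective f, and it substitutes a simple quadratic penalty plus a Nelder-Mead subsolver for the paper's actual subproblem solvers; all function names are hypothetical.

```python
# Minimal sketch of one inexact restoration iteration, under assumptions:
# smooth equality constraints h(x) = 0 with an available Jacobian jac_h,
# and a black-box objective f (no derivatives). The quadratic penalty and
# the Nelder-Mead subsolver are illustrative simplifications, not the
# subproblem solvers used in the paper.
import numpy as np
from scipy.optimize import minimize

def restoration_phase(x, h, jac_h, steps=5):
    """Phase 1: improve feasibility with Gauss-Newton steps on h(y) = 0.
    Only constraint derivatives are used; f is never evaluated here."""
    y = np.asarray(x, dtype=float).copy()
    for _ in range(steps):
        # Least-squares Newton step: solve jac_h(y) d ~= -h(y).
        d, *_ = np.linalg.lstsq(jac_h(y), -h(y), rcond=None)
        y += d
    return y

def optimization_phase(y, f, h, jac_h, trust_radius=0.5, mu=1e3):
    """Phase 2: decrease f subject to the linearized constraints
    h(y) + J d = 0, solved derivative-free since grad f is unknown."""
    J = jac_h(y)
    def model(d):
        lin = h(y) + J @ d                     # linearized infeasibility
        excess = max(0.0, np.linalg.norm(d) - trust_radius)
        return f(y + d) + mu * (lin @ lin) + mu * excess ** 2
    d = minimize(model, np.zeros_like(y), method="Nelder-Mead").x
    return y + d

def ir_iteration(x, f, h, jac_h):
    """One outer iteration: restoration, then constrained minimization."""
    return optimization_phase(restoration_phase(x, h, jac_h), f, h, jac_h)
```

A fixed penalty like this is only a stand-in for the acceptance tests a practical inexact restoration method would use; the sketch is meant to convey the two-phase structure, nothing more.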

Cited by 42 publications (36 citation statements); references 47 publications.
“…This framework has been developed in [17,16,7] and it proved useful for analyzing, in a unified manner, a number of different Newtonian and Newton-related algorithms for constrained optimization (truncated and augmented Lagrangian modifications of SQP itself, sequential quadratically constrained quadratic programming, and linearly constrained Lagrangian methods, to mention some of the applications); see [20,Chapter 4]. In this paper we continue this line of reasoning and show that in addition to the above, local convergence properties of the inexact restoration methods [22,21,23,3,10,8,4] and of composite-step SQP methods [26,24], [6,Section 15.4], can also be derived from the pSQP theory. The paper is organized as follows.…”
Section: Introduction (mentioning)
Confidence: 55%
“…Section 3 considers an "exact restoration" scheme, which is not a practical algorithm but rather serves as a natural first step to the analysis of inexact restoration methods, presented in Section 4. Inexact restoration methods have been receiving much attention in recent years; see [22,21,23,3,10,8,4]. Our considerations are related to the local framework of [3].…”
Section: Introduction (mentioning)
Confidence: 99%
“…In [108] an algorithm is proposed for problems with 'thin' constraints, based on relaxing feasibility and performing a subproblem restoration procedure. Inexact restoration has been applied in [38] to optimization problems where derivatives of the constraints are available for use, thus allowing derivative-based methods in the restoration phase.…”
Section: Nonlinearly Constrained Optimization (mentioning)
Confidence: 99%
“…Problems with smooth constraints (not necessarily convex) and a derivative-free objective function were also tackled by Bueno, Friedlander, Martínez & Sobral [11] within the inexact restoration approach, which performed favourably in terms of robustness in comparison with COBYLA and three other benchmarks. For problems with thin domains, defined by computationally inexpensive but highly nonlinear functions, Martínez & Sobral [51] have proposed the algorithm SKINNY, which splits the main iteration into a restoration step, where infeasibility is decreased without evaluating the objective function, followed by derivative-free minimization on a relaxed feasible set.…”
Section: Derivative-free Optimization (mentioning)
Confidence: 99%
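The restoration / relaxed-minimization split quoted above admits a similarly hedged sketch. Assumptions: infeas(x) measures constraint violation, restore is some routine that reduces it below a target tolerance without evaluating f, and the shrink factor is illustrative; none of these names come from the published SKINNY code.

```python
# Hedged sketch of the split described above (in the spirit of SKINNY):
# restore feasibility below a relaxed tolerance r without evaluating f,
# then minimize f derivative-free over the relaxed set {x : infeas(x) <= r}.
# All names and the shrink factor are illustrative assumptions.
from scipy.optimize import minimize

def relaxed_ir_step(x, f, infeas, restore, r, mu=1e3, shrink=0.5):
    y = restore(x, r)                      # restoration: f not evaluated
    penalized = lambda z: f(z) + mu * max(0.0, infeas(z) - r) ** 2
    z = minimize(penalized, y, method="Nelder-Mead").x
    return z, shrink * r                   # tighten the relaxation next time
```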