2007
DOI: 10.1137/050635432

Implementing Generating Set Search Methods for Linearly Constrained Minimization

Abstract: We discuss an implementation of a derivative-free generating set search method for linearly constrained minimization with no assumption of nondegeneracy placed on the constraints. The convergence guarantees for generating set search methods require that the set of search directions possesses certain geometrical properties that allow it to approximate the feasible region near the current iterate. In the hard case, the calculation of the search directions corresponds to finding the extreme rays of a cone…
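The abstract sketches the general GSS template: poll a set of directions that generates the cone of feasible directions near the current iterate, accept a trial point that improves the objective, and contract the step length when no polled point does. As a rough, hedged illustration of that template only, and not of the paper's own algorithm or of its treatment of the degenerate case (computing extreme rays of a cone), the Python sketch below handles the simplest linearly constrained setting, bound constraints, where the signed coordinate directions are known to generate every nearby tangent cone; the function name and default parameters are assumptions made for this example.

import numpy as np

def gss_bound_constrained(f, x0, lower, upper,
                          delta0=1.0, tol=1e-6, theta=0.5, max_iter=1000):
    # Minimal generating set search (GSS) sketch for bound constraints
    # lower <= x <= upper, a special case of linear constraints in which
    # the signed coordinate directions +/- e_i already generate the tangent
    # cone of the nearby constraints.  Hypothetical illustration only.
    x = np.clip(np.asarray(x0, dtype=float), lower, upper)
    fx = f(x)
    n = x.size
    delta = delta0
    D = np.vstack([np.eye(n), -np.eye(n)])   # generating set: +/- e_i
    for _ in range(max_iter):
        if delta < tol:                       # step-length stopping test
            break
        improved = False
        for d in D:
            trial = x + delta * d
            # Poll only trial points that remain feasible.
            if np.all(trial >= lower) and np.all(trial <= upper):
                ftrial = f(trial)
                if ftrial < fx:               # simple decrease
                    x, fx = trial, ftrial
                    improved = True
                    break
        if not improved:
            delta *= theta                    # unsuccessful: contract step
    return x, fx, delta

# Example: minimize (x1 - 1)^2 + (x2 + 2)^2 over the box [0, 2] x [0, 3];
# the constrained minimizer is (1, 0).
xopt, fopt, _ = gss_bound_constrained(
    lambda z: (z[0] - 1.0) ** 2 + (z[1] + 2.0) ** 2,
    x0=[2.0, 3.0], lower=np.array([0.0, 0.0]), upper=np.array([2.0, 3.0]))

For general linear constraints the generating set must instead be rebuilt at each iterate from the normals of the constraints that are nearly active, which is precisely the cone computation the abstract refers to in the hard, degenerate case.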

Cited by 101 publications (97 citation statements); references 20 publications.
“…This algorithm combines the positive features of the Constrained Particle Swarm Optimizer (CPSO) [28], Generating Set Search (GSS) [29], and the Complex [30]. Each iteration of the algorithm consists of three steps: (i) a search step corresponding to a population update of a revised CPSO, (ii) an optional (skipped if the CPSO improves the best solution found so far)…”
Section: Single-objective Optimization
mentioning, confidence: 99%
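The quoted description is truncated, so only the overall structure it names can be illustrated: a search step (here, the population update) followed by a poll-type step that is skipped whenever the search already improves the incumbent. A minimal, generic Python skeleton of that pattern follows; it is an assumption-laden sketch, not the cited hybrid, and search_step and poll_step are hypothetical callables standing in for the CPSO update and the GSS/Complex steps.

def hybrid_iteration(f, x_best, f_best, search_step, poll_step):
    # Generic search-then-poll iteration: run a heuristic search step
    # (e.g. a population update) and fall back to a poll-type step only
    # when the search fails to improve the best point found so far.
    # Skeleton for illustration; not the cited hybrid's exact algorithm.
    x_s, f_s = search_step(f, x_best)
    if f_s < f_best:
        return x_s, f_s          # search improved; poll step is skipped
    x_p, f_p = poll_step(f, x_best)
    if f_p < f_best:
        return x_p, f_p          # poll improved
    return x_best, f_best        # no improvement this iteration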
“…Observe that the initialization of δ0 in Algorithm 4.1 and the definition in (13), together with the update rule (15), ensure that, for all k ≥ 0, … As a practical matter, in the implementation of Algorithm 3.1 discussed in [15] the directions in G are normalized, so βmax = 1, which simplifies both (15) and (16).…”
Section: The Derivative-Free Stopping Criterion
mentioning, confidence: 99%
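The excerpt concerns a step-length-based stopping test and notes that normalizing the directions in G (so that βmax = 1) simplifies it. As illustrative context only, a representative bound from the GSS literature for the unconstrained, simple-decrease case is shown below; the constants and the linearly constrained analogue used in the cited (15) and (16) may differ. Here M is a Lipschitz constant for the gradient, κ(G) the cosine measure of the generating set, βmax the largest direction norm, and Δk the step-length parameter.

% At an unsuccessful GSS iteration k (no polled point improves f),
% with \beta_{\max} = \max_{d \in G} \lVert d \rVert:
\lVert \nabla f(x_k) \rVert \;\le\; \frac{M}{\kappa(G)}\,\beta_{\max}\,\Delta_k ,
% so with normalized directions (\beta_{\max} = 1) a tolerance on \Delta_k
% alone bounds the gradient norm, which is the rationale for stopping
% once \Delta_k falls below a threshold.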
“…Algorithmic developments and convergence have been studied for box-constrained problems (BCDFO) (Lewis and Torczon, 1999; Lucidi and Sciandrone, 2002; Audet and Dennis, 2002; Garcia-Palomares et al., 2013); for known linearly constrained problems (Lewis and Torczon, 2000; Lewis et al., 2007; Audet and Dennis, 2002; Kolda et al., 2006); … (Liuzzi et al., 2010); smoothed exact ℓ∞ penalty functions (Liuzzi and Lucidi, 2009) and exact penalty merit functions (Fasano et al., 2014; Gratton and Vicente, 2014); or restoration steps which are independent of the objective function (Martinez and Sobral, 2013; Arouxét et al., 2015). Allowing pattern-search methods to handle a dense set of polling directions instead of a finite set was a significant development for studying the convergence of generally constrained derivative-free problems which are Lipschitz even near a limit point (Audet and Dennis, 2006; Audet et al., 2008b).…”
Section: Global Optimization Advances in CDFO
mentioning, confidence: 99%