2007
DOI: 10.1007/s10898-007-9234-1

Nonsmooth optimization through Mesh Adaptive Direct Search and Variable Neighborhood Search

Cited by 156 publications (108 citation statements)
References 32 publications
“…Figure 3 is developed as a general representation of both direct-search and trust-region search which will be described next. The disadvantages of local direct-search CDFO are the high dependence on the initial point, entrapment within closest local optimum, and large number of function calls to guarantee convergence (Audet et al, 2008b;Conn et al, 2009b). In order to increase the probability for convergence to the global optimum, multi-start approaches are proposed, which are not efficient in cases where the model of interest is computationally expensive.…”
Section: Global Optimization Advances in CDFO
confidence: 99%
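The multi-start remedy mentioned in this excerpt is easy to illustrate. The sketch below is a minimal Python illustration written for this report, not code from the cited works: `compass_search` is a toy stand-in for a local direct-search solver, and `multi_start` simply launches it from several random points, which makes the evaluation-cost objection concrete, since the total number of function calls grows linearly with the number of starts.

```python
import numpy as np

def compass_search(f, x0, max_evals=200, step=0.5, tol=1e-6):
    """Tiny coordinate (compass) direct search, used here only as a stand-in
    local solver; real CDFO methods (MADS, DFO trust regions) are far richer."""
    x = np.asarray(x0, dtype=float)
    fx, evals = f(x), 1
    n = x.size
    while evals < max_evals and step > tol:
        improved = False
        for d in np.vstack([np.eye(n), -np.eye(n)]):   # poll +/- coordinate directions
            y = x + step * d
            fy = f(y)
            evals += 1
            if fy < fx:                                # accept the first improving point
                x, fx, improved = y, fy, True
                break
        if not improved:
            step *= 0.5                                # shrink after an unsuccessful poll
    return x, fx

def multi_start(f, bounds, local_search, n_starts=10, evals_per_start=200, seed=0):
    """Run the local solver from several random points and keep the best result.
    Total cost is roughly n_starts * evals_per_start function calls, which is
    exactly why multi-start becomes unattractive when each call is expensive."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T         # bounds given as [(lo_i, hi_i), ...]
    best_x, best_f = None, np.inf
    for _ in range(n_starts):
        x0 = lo + rng.random(lo.size) * (hi - lo)      # uniform random start point
        x, fx = local_search(f, x0, max_evals=evals_per_start)
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f

# Example: 5 restarts of the toy compass search on the Rosenbrock function.
rosen = lambda z: (1.0 - z[0])**2 + 100.0 * (z[1] - z[0]**2)**2
x_best, f_best = multi_start(rosen, [(-2, 2), (-2, 2)], compass_search, n_starts=5)
```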
“…Allowing pattern-search methods to handle a dense set of polling directions instead of a finite set of polling directions was a significant development for studying the convergence of generally constrained derivative-free problems which are Lipschitz even near a limit point (Audet and Dennis, 2006;Audet et al, 2008b). A recent methodology proposed in (Vicente and Custódio, 2012) studies convergence of local-search adaptive search in the case of discontinuous objective functions and general constraints, which is useful for many real applications.…”
Section: Global Optimization Advances in CDFO
confidence: 99%
“…Search step: evaluate the functions on a finite number of points of M(k, ∆_k).
Poll step: compute p MADS directions D_k ∈ R^{n×p}; construct the frame P_k ⊆ M(k, ∆_k) with x_k, D_k, and ∆_k; evaluate the functions on the p points of P_k.
[2] Updates: determine the type of success of iteration k; solution update (x_{k+1}); mesh update (∆_{k+1}); k ← k + 1; check the stopping conditions, goto [1] (see [18]).
A corollary of this result is that without constraints and if f is strictly differentiable, then ∇f(x) = 0.…”
Section: Variable Neighborhood Search (VNS) Strategy
confidence: 99%
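To make the quoted search/poll/update cycle easier to follow, here is a compact Python sketch of one such iteration. It is a simplification written for this report, not the algorithm of the cited paper or the NOMAD implementation: the poll uses the 2n plus/minus coordinate directions scaled by the mesh size (a GPS-style frame) instead of the dense MADS direction sets, and the search step merely samples a few random mesh points.

```python
import numpy as np

def mads_like_iteration(f, x, f_x, delta, rng, n_search=4):
    """One simplified search/poll/update cycle on the mesh M(k, delta),
    whose points have the form x + delta * (integer vector). Illustrative only."""
    n = x.size

    # Search step: evaluate f on a finite number of mesh points.
    candidates = [x + delta * rng.integers(-3, 4, size=n) for _ in range(n_search)]

    # Poll step: build a frame around x from a set of directions
    # (here +/- coordinate directions; true MADS uses richer direction sets).
    directions = np.vstack([np.eye(n), -np.eye(n)])
    candidates += [x + delta * d for d in directions]

    # Updates: classify the iteration, move the incumbent, and adapt the mesh.
    values = [f(y) for y in candidates]
    j = int(np.argmin(values))
    if values[j] < f_x:              # successful iteration: accept and coarsen the mesh
        x, f_x = candidates[j], values[j]
        delta *= 2.0
    else:                            # unsuccessful iteration: refine the mesh
        delta *= 0.5
    return x, f_x, delta

# Example driver with a stopping condition on the mesh size.
f = lambda z: float(np.sum(z**2))
rng = np.random.default_rng(0)
x, fx, delta = np.array([2.0, 2.0]), 8.0, 1.0
for k in range(100):
    x, fx, delta = mads_like_iteration(f, x, fx, delta, rng)
    if delta < 1e-8:
        break
```

Refining the mesh only after unsuccessful iterations is the mechanism behind the convergence statement quoted in the excerpt above.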
“…Although the MADS algorithm is quite powerful in local search, it needs some special considerations to be successful in global search too. Variable Neighborhood Search (VNS) is an algorithm integrated into NOMAD to improve global search of the design space [25]. While the default value of VNS budget is equal to 75% of overall budget, two more global search budgets have been also investigated using VNS equal to 85% and 95% of the overall budget.…”
Section: NOMAD
confidence: 99%
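For readers unfamiliar with how a VNS budget interacts with the local MADS search, the sketch below shows the idea in generic form. It is not NOMAD code and uses none of NOMAD's parameter names: `vns_fraction` is a hypothetical knob standing in for the "share of the overall budget given to VNS" discussed above, and `local_search` can be any local direct-search routine (for instance the toy compass search sketched earlier in this report).

```python
import numpy as np

def vns_over_local_search(f, x0, local_search, total_evals=2000,
                          vns_fraction=0.75, k_max=5, seed=0):
    """Basic Variable Neighborhood Search wrapper around a local solver.
    A fraction `vns_fraction` of the evaluation budget goes to shaking and
    re-descending from perturbed points (the global part); the rest polishes
    the incumbent locally. Illustrative sketch only."""
    rng = np.random.default_rng(seed)

    # Local part of the budget: polish the starting point.
    x, fx = local_search(f, np.asarray(x0, dtype=float),
                         max_evals=int((1 - vns_fraction) * total_evals))

    # Global (VNS) part of the budget: shake, descend, accept or widen.
    vns_budget = int(vns_fraction * total_evals)
    per_descent = max(1, vns_budget // (4 * k_max))   # crude split of the VNS share
    k = 1
    while vns_budget > 0:
        x_shake = x + rng.normal(scale=0.5 * k, size=x.size)  # shaking in neighborhood k
        y, fy = local_search(f, x_shake, max_evals=per_descent)
        vns_budget -= per_descent
        if fy < fx:
            x, fx, k = y, fy, 1            # improvement: recentre, smallest neighborhood
        else:
            k = k + 1 if k < k_max else 1  # no improvement: widen, wrap after k_max
    return x, fx
```

In terms of this sketch, raising `vns_fraction` from 0.75 to 0.85 or 0.95, as investigated in the excerpt, simply shifts evaluations from the local polishing phase to the shaking phase.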