2015
DOI: 10.1007/s10898-015-0384-2
Delaunay-based derivative-free optimization via global surrogates, part I: linear constraints

Cited by 14 publications (13 citation statements)
References 27 publications
“…A comparison between the minima of these two search functions is made in order to decide between further sampling (and, therefore, refining) an existing measurement, or sampling at a new point in parameter space. The method developed builds closely on the Delaunay-based Derivative-free Optimization via Global Surrogates algorithm, dubbed ∆-DOGS, proposed in [12][13][14]. Convergence of the algorithm is established in problems for which a.…”
Section: Discussion
confidence: 99%
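The decision rule quoted above reduces to comparing the minima of two search functions, one over existing measurements and one over unsampled points. A minimal sketch of that comparison is below; the function name and inputs are assumptions made for illustration, not the cited paper's formulation.

```python
def choose_next_sample(refine_min, refine_point, explore_min, explore_point):
    """Hypothetical decision rule: refine the existing measurement if its
    search function attains the lower minimum; otherwise sample at the
    new point in parameter space."""
    if refine_min <= explore_min:
        return "refine", refine_point
    return "explore", explore_point
```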
“…This section presents a simplified version of the ∆-DOGS(Z) algorithm, the full version of which is given as Algorithm 2 of [12], where it is analyzed in detail. The ∆-DOGS(Z) algorithm is a grid-based acceleration of the ∆-DOGS algorithm originally developed in [14], and is designed for minimization problems in which precise function evaluations are available, while avoiding an accumulation of unnecessary function evaluations on the boundary of the feasible domain. The optimization problem considered in this section is the minimization of an objective function f(x), approximations of which are assumed to be available, over the feasible domain L = {x ∈ R^n : a ≤ x ≤ b}.…”
Section: Delaunay-based Optimization Coordinated With a Grid: ∆-DOGS(Z)
confidence: 99%
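To make the quoted setup concrete, the following is a heavily simplified sketch of one Delaunay-based search iteration in the spirit of ∆-DOGS, not the algorithm of [12] or [14]: the RBF surrogate, the simplex-centroid candidate set, the distance-based uncertainty proxy, and the target value y0 are all assumptions made for illustration.

```python
import numpy as np
from scipy.spatial import Delaunay
from scipy.interpolate import RBFInterpolator

def search_step(points, values, y0):
    """Hypothetical single iteration: triangulate the evaluated points,
    fit an interpolating surrogate p(x), and score each simplex centroid
    with a target-form search function s(x) = (p(x) - y0) / e(x), where
    the uncertainty proxy e(x) vanishes at evaluated points."""
    tri = Delaunay(points)               # Delaunay triangulation of the samples
    p = RBFInterpolator(points, values)  # stand-in for the paper's interpolant
    best_x, best_s = None, np.inf
    for simplex in tri.simplices:
        verts = points[simplex]
        c = verts.mean(axis=0)           # candidate point: simplex centroid
        # distance-based uncertainty proxy; positive for nondegenerate simplices
        e = np.min(np.linalg.norm(verts - c, axis=1)) ** 2
        s = (float(p(c[None, :])[0]) - y0) / e
        if s < best_s:
            best_x, best_s = c, s
    return best_x

# Usage on a toy box-constrained problem over L = [0, 1]^2:
rng = np.random.default_rng(0)
X = rng.uniform(size=(12, 2))
y = np.array([np.sum((x - 0.3) ** 2) for x in X])
print(search_step(X, y, y0=0.0))
```

Restricting candidates to simplex centroids keeps the sketch short; the full algorithm instead minimizes the search function continuously over each simplex and, in the ∆-DOGS(Z) variant, quantizes the resulting point onto a Cartesian grid to avoid clustering evaluations on the domain boundary.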