2016
DOI: 10.1007/s10898-016-0433-5
Delaunay-based derivative-free optimization via global surrogates, part II: convex constraints

Cited by 9 publications (5 citation statements)
References 36 publications
“…A comparison between the minima of these two search functions is made in order to decide between further sampling (and, therefore, refining) an existing measurement, or sampling at a new point in parameter space. The method developed builds closely on the Delaunay-based Derivative-free Optimization via Global Surrogates algorithm, dubbed ∆-DOGS, proposed in [12][13][14]. Convergence of the algorithm is established in problems for which a.…”
Section: Discussion (mentioning)
confidence: 99%
“…The best technique for computing the regression is problem dependent. As with [12][13][14], a key advantage of our Delaunay-based approach in the present work is that it facilitates the use of any suitable regression technique, subject to it satisfying the "strict" regression property given in Definition 4. Since our numerical tests all implement the polyharmonic spline regression technique, the derivation of this regression technique is briefly explained in this appendix; additional details may be found in [46].…”
Section: Appendix A: Polyharmonic Spline Regression (mentioning)
confidence: 99%
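The excerpt above notes that the Delaunay-based framework admits any "strict" regression technique, and that the numerical tests use polyharmonic spline regression. The following is a minimal sketch of a standard polyharmonic spline fit (cubic kernel with a linear polynomial tail); the `smoothing` regularization term and all names here are illustrative assumptions, not the cited paper's exact formulation.

```python
import numpy as np

def polyharmonic_spline_fit(X, y, smoothing=0.0):
    """Fit s(x) = sum_i w_i * phi(||x - x_i||) + v^T [1, x] with the
    cubic kernel phi(r) = r^3. With smoothing = 0 this interpolates
    the data exactly; smoothing > 0 gives a regression (a common
    regularization, assumed here rather than taken from the paper)."""
    n, d = X.shape
    # Pairwise distances between sample points
    r = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    A = r**3 + smoothing * np.eye(n)          # kernel block
    P = np.hstack([np.ones((n, 1)), X])       # linear polynomial tail
    # Saddle-point system: the zero block enforces that the kernel
    # weights w are orthogonal to the polynomial basis P
    M = np.block([[A, P], [P.T, np.zeros((d + 1, d + 1))]])
    rhs = np.concatenate([y, np.zeros(d + 1)])
    sol = np.linalg.solve(M, rhs)
    w, v = sol[:n], sol[n:]

    def s(x):
        rx = np.linalg.norm(X - np.asarray(x), axis=1)
        return rx**3 @ w + v @ np.concatenate([[1.0], np.asarray(x)])
    return s
```

With `smoothing=0` the returned surrogate reproduces the sampled values at the sample points, which is the property a Delaunay-based search function builds on.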
“…Other works stem from the research presented in [11,21], whose line-search approach has been extended to box and linearly constrained problems in [20] and [22], while more general constraints are covered in [19]. An interesting work in the field of global optimization is [8]. We refer the reader interested in derivative-based methods, which are not considered in this work, to [7].…”
Section: Introduction (mentioning)
confidence: 99%