2018
DOI: 10.1007/s12532-018-0144-7
RBFOpt: an open-source library for black-box optimization with costly function evaluations

Cited by 139 publications (176 citation statements); references 42 publications.
“…Within tasks of the same type, we follow a first come, first served policy. Whenever a task of type 1 is completed, it yields a new interpolation point that is added to the set S. Whenever a task of type 2 is completed, we check if the newly determined search point is to be discarded because of several criteria also employed in the serial version of the optimization algorithm (see [14]), and if the search point is accepted, we queue a task of type 1 to evaluate f at it. An undesirable event can in principle occur if, while f is being evaluated at a point y, the same point y is generated as a search point by concurrent tasks of type 2, and therefore the evaluation of f(y) is performed multiple times.…”
Section: Parallelization of RBFOpt (mentioning)
confidence: 99%
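The two-task scheme above can be illustrated with a serialized toy sketch. This is not RBFOpt's actual code: `black_box_f`, `propose_search_point`, and the acceptance criterion are hypothetical stand-ins, and the `seen` set plays the role of the duplicate check. In a truly concurrent setting the undesirable event described above can still occur, because a point may be generated by a type-2 task while its evaluation is already in flight.

```python
from collections import deque

def black_box_f(x):
    """Stand-in for the costly objective (hypothetical)."""
    return sum(v * v for v in x)

def propose_search_point(S):
    """Hypothetical type-2 task: midpoint of the two best points so far."""
    best = sorted(S, key=lambda pv: pv[1])[:2]
    return tuple((a + b) / 2 for a, b in zip(best[0][0], best[1][0]))

def run(initial_points, budget):
    S = []                       # interpolation set of (point, value) pairs
    seen = set()                 # guards against evaluating a point twice
    tasks = deque(("evaluate", p) for p in initial_points)
    evaluations = 0
    while tasks and evaluations < budget:
        kind, payload = tasks.popleft()     # first come, first served
        if kind == "evaluate":              # type 1: costly evaluation of f
            if payload in seen:
                continue                    # duplicate search point: discard
            seen.add(payload)
            S.append((payload, black_box_f(payload)))
            evaluations += 1
            tasks.append(("search", None))
        else:                               # type 2: generate a search point
            if len(S) >= 2:
                y = propose_search_point(S)
                tasks.append(("evaluate", y))   # queue a type-1 task at y
    return S

pts = run([(0.0, 4.0), (2.0, 0.0), (4.0, 4.0)], budget=6)
```

Note that the unchanged interpolation set lets several queued type-2 tasks propose the same point `y`; the `seen` check then ensures `f(y)` is computed only once.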
“…An example of two surrogate models that interpolate four data points. The solid‐line model is more likely than the dashed‐line one since it is less bumpy (Costa and Nannicini, ).…”
Section: Surrogate-Based Subproblems (mentioning)
confidence: 99%
“…In Jakobsson et al. (), the authors show that the optimal error term $e$ can be obtained by solving an explicit linear system: $\left(\frac{\mu}{1-\mu} I_{n\times n} + B^{T}AB\right)e = B^{T}AB\tilde{f}$. A similar approach to dealing with noise is introduced in Costa and Nannicini (). In that paper, the range within which function values are allowed to vary must be specified.…”
Section: Surrogate-Based Subproblems (mentioning)
confidence: 99%
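As a minimal numerical sketch, assuming the system has the reconstructed form $(\frac{\mu}{1-\mu} I_{n\times n} + B^{T}AB)\,e = B^{T}AB\tilde{f}$, the error term can be obtained with a standard dense solve. The matrices here are random placeholders, not the RBF interpolation matrices of either paper:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
mu = 0.3                          # noise weight in (0, 1), hypothetical value
B = rng.standard_normal((n, n))
A = B @ B.T + n * np.eye(n)       # symmetric positive definite, for the demo
f_tilde = rng.standard_normal(n)  # noisy function values at the n points

# Left-hand-side matrix: mu/(1-mu) * I + B^T A B (positive definite, so
# the system has a unique solution).
M = (mu / (1.0 - mu)) * np.eye(n) + B.T @ A @ B
e = np.linalg.solve(M, B.T @ A @ B @ f_tilde)
```

Because $B^{T}AB$ is positive semidefinite and the identity term is positive, $M$ is positive definite and `np.linalg.solve` succeeds; as $\mu \to 1$ the identity term dominates and $e \to 0$, i.e. the surrogate is forced to interpolate exactly.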