2017
DOI: 10.1007/s10596-017-9657-9
Distributed Gauss-Newton optimization method for history matching problems with multiple best matches

Cited by 34 publications (6 citation statements)
References 59 publications
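For context on the paper's subject, history matching is typically posed as a nonlinear least-squares problem, minimizing the misfit between simulated and observed data, which Gauss-Newton methods solve iteratively. A minimal single-machine sketch under assumed conditions (the exponential toy model and synthetic data below are illustrative stand-ins, not the paper's reservoir simulator or its distributed scheme):

```python
import numpy as np

def gauss_newton(residual, jacobian, x0, iters=20):
    """Basic undamped Gauss-Newton iteration for least squares."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        r = residual(x)   # misfit vector: simulated minus observed data
        J = jacobian(x)   # sensitivity (Jacobian) matrix
        # Solve the normal equations J^T J dx = -J^T r for the step
        dx = np.linalg.solve(J.T @ J, -J.T @ r)
        x = x + dx
    return x

# Toy example: fit y = a * exp(b * t) to synthetic observations.
t = np.linspace(0.0, 1.0, 10)
d_obs = 2.0 * np.exp(1.5 * t)

def residual(x):
    a, b = x
    return a * np.exp(b * t) - d_obs

def jacobian(x):
    a, b = x
    return np.column_stack([np.exp(b * t), a * t * np.exp(b * t)])

# Start near the optimum; undamped Gauss-Newton has no globalization.
x_best = gauss_newton(residual, jacobian, [1.8, 1.4])
```

A production history-matching code would add damping (Levenberg-Marquardt style) and, as in the paper's setting, distribute the expensive simulation runs; this sketch shows only the core update.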
“…The under-performance of plain, ensemble-based algorithms in handling multi-modal variables is also discussed in the literature; see, for example, Elsheikh et al. (2013) and Gao et al. (2017, 2018). […] of kernel parameters that are associated with the center points (note that different clusters of training data share the same set of center points). The corrected predictions are then taken as the biased outputs plus certain residual terms, with the latter calculated as weighted averages of the residuals predicted using the kernel parameters in each cluster.…”
Section: Results With Respect To Multi-modal Inputs
Confidence: 99%
“…While EXPLO2 is parallelized (at marginal cost to performance), to the best of our knowledge no competing technique is, except for *-CMA-ES. Indeed, as Gao et al. (2017) point out, most surrogate-based derivative-free optimizers, including NEWUOA, require sequential function evaluations. Though parallel algorithms exist (Haftka et al., 2016; Rehbach et al., 2018; Xia & Shoemaker, 2020), these are relatively few in number outside the context of Bayesian optimization (which is unsuitable for high-dimensional problems), and parallel techniques appropriate for high-dimensional problems have been considered in our design and/or benchmarking.…”
Section: Parallel Performance
Confidence: 99%
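The excerpt above contrasts sequential surrogate-based optimizers, which propose one trial point per iteration, with distributed methods that evaluate a whole batch of trial points concurrently. A minimal sketch of that batch-evaluation pattern using Python's `concurrent.futures`; the `objective` function here is a hypothetical stand-in for an expensive simulation, not any method from the cited works:

```python
from concurrent.futures import ProcessPoolExecutor
import math

def objective(x):
    """Cheap placeholder for a costly black-box simulation."""
    return sum((xi - 1.0) ** 2 for xi in x) + math.sin(sum(x))

def evaluate_batch(points):
    """Evaluate all trial points concurrently, one per worker process."""
    with ProcessPoolExecutor() as pool:
        return list(pool.map(objective, points))

if __name__ == "__main__":
    # One optimizer iteration's worth of trial points, run in parallel
    # instead of one-at-a-time as in a sequential method.
    trial_points = [[0.5 * i, 1.0 - 0.1 * i] for i in range(8)]
    losses = evaluate_batch(trial_points)
    best = trial_points[losses.index(min(losses))]
```

Because the objective evaluations are independent within a batch, wall-clock time per iteration scales with the slowest single evaluation rather than the batch size, which is the advantage the excerpt attributes to parallelized methods.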