2019
DOI: 10.1214/18-ba1121
Efficient Acquisition Rules for Model-Based Approximate Bayesian Computation

Abstract: Approximate Bayesian computation (ABC) is a method for Bayesian inference when the likelihood is unavailable but simulating from the model is possible. However, many ABC algorithms require a large number of simulations, which can be costly. To reduce the computational cost, Bayesian optimisation (BO) and surrogate models such as Gaussian processes have been proposed. Bayesian optimisation enables one to intelligently decide where to evaluate the model next but common BO strategies are not designed for the goal…

Cited by 50 publications (87 citation statements). References 38 publications.
“…Therefore, this random displacement can be drawn from a Gaussian distribution with a full-width-at-half-maximum set by the current estimate of the posterior contours (most simply approximated by using the desired number (e.g., two) of sigma from the 1D marginalised distributions; see ref. [56] for a discussion and comparison of deterministic and stochastic acquisition rules). In general, these “exploration” terms should be tuned to the size of the credible region that is required to be well estimated.…”
Section: Acquisition Function (mentioning)
confidence: 99%
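The stochastic displacement described in the statement above can be sketched as follows. This is a minimal illustration, not the cited papers' implementation: the function name and its inputs (the current discrepancy minimiser and per-parameter FWHM targets) are hypothetical, and the only substantive step is the standard conversion from a Gaussian's full-width-at-half-maximum to its standard deviation.

```python
import numpy as np

def stochastic_acquisition(theta_min, marginal_widths, rng=None):
    """Propose the next simulation point by randomly displacing the
    current minimiser theta_min (hypothetical helper for illustration).

    marginal_widths: per-parameter full-width-at-half-maximum targets,
    e.g. the 2-sigma widths of the 1D marginalised posteriors.
    """
    rng = np.random.default_rng(rng)
    # Convert FWHM to the standard deviation of a Gaussian:
    # FWHM = 2 * sqrt(2 * ln 2) * sigma  =>  sigma = FWHM / 2.3548
    sigma = np.asarray(marginal_widths) / (2.0 * np.sqrt(2.0 * np.log(2.0)))
    return np.asarray(theta_min) + rng.normal(0.0, sigma)
```

Tuning `marginal_widths` to the size of the credible region, as the statement suggests, controls how far the "exploration" displacement typically reaches.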
“…This naturally dovetails into Gaussian process emulation, which provides a robust estimate of the uncertainty in predictions across the parameter space [51]. The details of the balance between exploration and exploitation are encoded in the acquisition function used to determine future proposals, of which many examples have been developed (we use a novel expansion of the GP-UCB acquisition function) [52][53][54][55][56]. We also show how to propose multiple training samples simultaneously (batch acquisition).…”
Section: Introduction (mentioning)
confidence: 99%
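For readers unfamiliar with GP-UCB-style rules, here is a minimal sketch of the basic (unexpanded) confidence-bound idea, assuming a minimisation setting with hypothetical GP posterior means `mu` and standard deviations `std` at a set of candidate points; it is not the novel expansion the citing paper refers to.

```python
import numpy as np

def gp_lcb(mu, std, beta=2.0):
    """GP lower confidence bound for minimising a discrepancy:
    a small predictive mean (exploitation) and a large predictive
    standard deviation (exploration) both make a point attractive."""
    return mu - beta * std

# Pick the next evaluation among candidate points (made-up values):
mu = np.array([1.0, 0.8, 1.2])
std = np.array([0.1, 0.05, 0.6])
next_idx = int(np.argmin(gp_lcb(mu, std)))
```

Here the third candidate wins despite its larger mean, because its high predictive uncertainty dominates at `beta=2.0`; `beta` is the knob that trades exploration against exploitation.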
“…As pointed out by Järvenpää et al. (2017), in Bayesian optimisation for approximate Bayesian computation, the goal should not be to find the minimum of J(θ), but rather to minimise the expected uncertainty in the estimate of the approximate posterior over the future evaluation of the simulator at θ.…”
Section: Expected Integrated Variance (mentioning)
confidence: 72%
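The criterion described in the statement above can be written schematically as follows. This is a sketch in my own notation, not reproduced verbatim from the paper: the next evaluation location is chosen to minimise the variance of the (unnormalised) ABC posterior estimate, integrated over the parameter space, in expectation over the yet-unseen simulator output y′ at the candidate location θ′:

```latex
\theta^{*} \;=\; \arg\min_{\theta'}\;
\mathbb{E}_{y' \mid \theta',\, \mathcal{D}}
\left[
  \int \operatorname{Var}\!\left(
    \tilde{\pi}_{\mathrm{ABC}}(\theta) \,\middle|\,
    \mathcal{D} \cup \{(\theta', y')\}
  \right) \mathrm{d}\theta
\right]
```

where 𝒟 denotes the current set of simulator evaluations and π̃_ABC the surrogate-based estimate of the approximate posterior.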
“…In high dimension, the integral can become prohibitively expensive to compute on a grid. As discussed by Järvenpää et al. (2017), it can then be evaluated with Monte Carlo or quasi-Monte Carlo methods such as importance sampling.…”
Section: Expected Integrated Variance (mentioning)
confidence: 80%
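The importance-sampling idea mentioned above can be illustrated in a few lines. This is a generic sketch, not the estimator used in the cited work: the function names are hypothetical, and the example integrates a standard normal density (whose integral is 1) under a wider normal proposal, standing in for the integrand of the expected-integrated-variance criterion.

```python
import numpy as np

def normal_pdf(x, mu=0.0, sigma=1.0):
    """Gaussian density, written out to keep the sketch self-contained."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

def is_integral(g, sample_q, pdf_q, n=100_000, seed=0):
    """Importance-sampling estimate of the integral of g: draw
    theta_i ~ q and average the ratios g(theta_i) / q(theta_i)."""
    rng = np.random.default_rng(seed)
    theta = sample_q(rng, n)
    return float(np.mean(g(theta) / pdf_q(theta)))

# Integrate N(0, 1) using a heavier-tailed N(0, 2) proposal;
# the estimate should be close to the true value, 1.
est = is_integral(normal_pdf,
                  lambda rng, n: rng.normal(0.0, 2.0, n),
                  lambda t: normal_pdf(t, 0.0, 2.0))
```

The proposal must cover the regions where the integrand is non-negligible (here, heavier tails than the target), or the weights `g/q` become high-variance and the estimate unreliable.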