Multilevel rejection sampling for approximate Bayesian computation
2018
DOI: 10.1016/j.csda.2018.02.009

Abstract: Likelihood-free methods, such as approximate Bayesian computation, are powerful tools for practical inference problems with intractable likelihood functions. Markov chain Monte Carlo and sequential Monte Carlo variants of approximate Bayesian computation can be effective techniques for sampling posterior distributions in an approximate Bayesian computation setting. However, without careful consideration of convergence criteria and selection of proposal kernels, such methods can lead to very biased inference or…
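For context, the baseline that multilevel rejection sampling builds on is plain ABC rejection sampling. The sketch below is illustrative only and is not the paper's multilevel algorithm: the Gaussian toy model, uniform prior, summary statistic, and tolerance are assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy problem (assumed for illustration): infer the mean of a Gaussian
# with known unit variance from 50 observations.
y_obs = rng.normal(loc=2.0, scale=1.0, size=50)
s_obs = np.mean(y_obs)  # summary statistic of the observed data


def abc_rejection(n_samples, epsilon):
    """Plain ABC rejection sampling: draw parameters from the prior,
    simulate data, and keep draws whose simulated summary statistic
    lies within epsilon of the observed summary."""
    accepted = []
    while len(accepted) < n_samples:
        theta = rng.uniform(-5.0, 5.0)                     # prior draw
        y_sim = rng.normal(loc=theta, scale=1.0, size=50)  # forward simulation
        if abs(np.mean(y_sim) - s_obs) <= epsilon:         # discrepancy check
            accepted.append(theta)
    return np.array(accepted)


draws = abc_rejection(n_samples=500, epsilon=0.1)
print(draws.mean(), draws.std())
```

The acceptance rate collapses as the tolerance epsilon shrinks; that cost is what a multilevel treatment of a decreasing sequence of tolerances aims to reduce.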

Cited by 27 publications (43 citation statements). References 48 publications.

“…As a result, we apply an ABC variant of a Markov chain Monte Carlo (MCMC) sampler (Marjoram et al, 2003) (see Appendix C). While we find that the ABC MCMC sampler works well for this problem, other advanced ABC-based Monte Carlo schemes are also possible, such as sequential Monte Carlo (Sisson et al, 2007) and multilevel Monte Carlo (Warne et al, 2018). Using the ABC MCMC sampler, the four parameter posterior marginal PDFs derived from Equation (20) are estimated using the same data and observation error as in Section 4.2.…”
Section: Inference on the Generalised Porous Fisher Model (mentioning)
confidence: 99%
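The ABC-MCMC scheme referred to in the statement above goes back to Marjoram et al. (2003). Below is a minimal sketch of that style of sampler, assuming the same Gaussian toy problem as earlier and arbitrary tuning constants; it is not the citing paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(2)

# Assumed toy setup: infer the mean of a Gaussian with known unit variance.
y_obs = rng.normal(loc=2.0, scale=1.0, size=50)
s_obs = np.mean(y_obs)


def log_prior(theta):
    # Uniform prior on [-5, 5]; log-density is constant inside the support.
    return 0.0 if -5.0 <= theta <= 5.0 else -np.inf


def abc_mcmc(n_iters, epsilon, step=0.5, theta0=0.0):
    """ABC-MCMC in the spirit of Marjoram et al. (2003): a random-walk
    proposal is accepted only if its simulated summary lies within
    epsilon of the observed summary AND it passes the prior ratio test
    (the proposal is symmetric, so its ratio cancels)."""
    chain = np.empty(n_iters)
    theta = theta0
    for i in range(n_iters):
        prop = theta + step * rng.standard_normal()
        y_sim = rng.normal(loc=prop, scale=1.0, size=50)
        close_enough = abs(np.mean(y_sim) - s_obs) <= epsilon
        log_alpha = log_prior(prop) - log_prior(theta)
        if close_enough and np.log(rng.uniform()) < log_alpha:
            theta = prop
        chain[i] = theta
    return chain


chain = abc_mcmc(n_iters=5000, epsilon=0.1)
print(chain[1000:].mean())  # crude burn-in removal before summarising
```

How well the chain mixes depends strongly on the proposal step size and the tolerance, which echoes the convergence and proposal-kernel concerns raised in the abstract.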
“…Such a benchmark would be significantly more computationally intensive than the comparison of Monte Carlo methods for the forwards problem (Figure 3). The computational statistics literature demonstrates that ABCMCMC [94], ABCSMC [11,121] and ABCMLMC [61,73,144] can be tuned to provide very competitive results on a given inference problem. However, a large number of trial computations are often required to achieve this tuning, or more complex adaptive schemes need to be exploited [30,111].…”
Section: Methods (mentioning)
confidence: 99%
“…The formulation of ABCSMC using a sequence of discrepancy thresholds hints that MLMC ideas could also be applicable. Recently, a variety of MLMC methods for ABC have been proposed [61,73,144]. All of these approaches are similar in their application of the multilevel telescoping summation to compute expectations with respect to ABC posterior distributions,…”
Section: Samplers for Approximate Bayesian Computation (mentioning)
confidence: 99%
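The multilevel telescoping summation mentioned in this last statement can be written generically as follows, for a decreasing sequence of tolerances; the notation is assumed here and need not match that of the cited works.

```latex
% Generic multilevel telescoping decomposition for ABC expectations:
% a cheap coarse-level estimate plus corrections between successive
% tolerance levels.
\mathbb{E}_{\epsilon_L}\left[ f(\theta) \right]
  = \mathbb{E}_{\epsilon_0}\left[ f(\theta) \right]
  + \sum_{l=1}^{L} \left(
      \mathbb{E}_{\epsilon_l}\left[ f(\theta) \right]
      - \mathbb{E}_{\epsilon_{l-1}}\left[ f(\theta) \right]
    \right),
\qquad \epsilon_0 > \epsilon_1 > \cdots > \epsilon_L ,
```

Here \mathbb{E}_{\epsilon_l} denotes expectation with respect to the ABC posterior at tolerance \epsilon_l. Each correction term can be estimated with relatively few samples when successive levels are positively correlated, so most of the simulation budget is spent at the cheap coarse levels.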