2019
DOI: 10.1080/07362994.2019.1566006

Multilevel Monte Carlo in approximate Bayesian computation

Abstract: In the following article we consider approximate Bayesian computation (ABC) inference. We introduce a method for numerically approximating ABC posteriors using multilevel Monte Carlo (MLMC). A sequential Monte Carlo version of the approach is developed, and it is shown under some assumptions that, for a given level of mean square error, this method for ABC has a lower cost than i.i.d. sampling from the most accurate ABC approximation. Several numerical examples are given.
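The abstract does not show the underlying identity, so the following LaTeX sketch (notation chosen here, not quoted from the paper) records the generic multilevel telescoping decomposition that MLMC estimators for ABC build on, assuming a decreasing sequence of ABC tolerances epsilon_0 > epsilon_1 > ... > epsilon_L with corresponding ABC posteriors pi_{epsilon_l} and a test function varphi:

% Generic MLMC telescoping identity for ABC posterior expectations
% (an illustrative decomposition, not the paper's exact estimator):
\mathbb{E}_{\pi_{\epsilon_L}}[\varphi(\theta)]
  = \mathbb{E}_{\pi_{\epsilon_0}}[\varphi(\theta)]
  + \sum_{l=1}^{L} \left( \mathbb{E}_{\pi_{\epsilon_l}}[\varphi(\theta)]
                        - \mathbb{E}_{\pi_{\epsilon_{l-1}}}[\varphi(\theta)] \right)

Each bracketed increment is estimated with coupled samples at consecutive tolerances; the small variance of these increments is the usual source of the cost advantage over i.i.d. sampling from the epsilon_L approximation claimed in the abstract.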

Cited by 16 publications (22 citation statements); references 26 publications.

Citation statements (ordered by relevance):
“…Guha and Tan (2017) extend the work of Efendiev et al (2015) by replacing the Metropolis–Hastings acceptance probability in a similar way to the MCMC-ABC method (Marjoram et al, 2003). The MLSMC method (Beskos et al, 2017) is exploited to achieve coupling in an ABC context by Jasra et al (2017).…”
Section: Related Work (mentioning)
confidence: 99%
“…Such a benchmark would be significantly more computationally intensive than the comparison of Monte Carlo methods for the forwards problem (Figure 3). The computational statistics literature demonstrates that ABCMCMC [94], ABCSMC [11,121] and ABCMLMC [61,73,144] can be tuned to provide very competitive results on a given inference problem. However, a large number of trial computations are often required to achieve this tuning, or more complex adaptive schemes need to be exploited [30,111].…”
Section: Methods (mentioning)
confidence: 99%
“…The formulation of ABCSMC using a sequence of discrepancy thresholds hints that MLMC ideas could also be applicable. Recently, a variety of MLMC methods for ABC have been proposed [61,73,144]. All of these approaches are similar in their application of the multilevel telescoping summation to compute expectations with respect to ABC posterior distributions,…”
Section: Samplers For Approximate Bayesian Computation (mentioning)
confidence: 99%
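To make the telescoping construction referenced above concrete, here is a minimal, hypothetical Python sketch (the model, the tolerances, and names such as rejection_abc and mlmc_abc_expectation are illustrative, not from any cited paper). Each level increment reuses one batch of prior draws and simulations for both tolerances, which gives a crude coupling via nested acceptance regions; the cited MLMC/MLSMC-ABC methods use more sophisticated sequential Monte Carlo couplings.

import numpy as np

rng = np.random.default_rng(1)
y_obs = 1.0  # observed summary statistic (toy example)

def prior_sample(n):
    # Draw n parameters from a N(0, 1) prior.
    return rng.normal(0.0, 1.0, size=n)

def simulate(theta):
    # Toy simulator: summary statistic is theta plus N(0, 0.5^2) noise.
    return theta + rng.normal(0.0, 0.5, size=theta.shape)

def rejection_abc(eps, n_sim):
    # Accept parameters whose simulated summary lies within eps of y_obs.
    theta = prior_sample(n_sim)
    return theta[np.abs(simulate(theta) - y_obs) < eps]

def increment_estimate(phi, eps_fine, eps_coarse, n_sim):
    # Estimate E_{eps_fine}[phi] - E_{eps_coarse}[phi] from one batch of
    # simulations, reusing the same draws for both tolerances (nested
    # acceptance regions act as a simple coupling).
    theta = prior_sample(n_sim)
    dist = np.abs(simulate(theta) - y_obs)
    return phi(theta[dist < eps_fine]).mean() - phi(theta[dist < eps_coarse]).mean()

def mlmc_abc_expectation(phi, eps_levels, n_sims):
    # Telescoping sum: coarsest-level estimate plus level increments.
    est = phi(rejection_abc(eps_levels[0], n_sims[0])).mean()
    for l in range(1, len(eps_levels)):
        est += increment_estimate(phi, eps_levels[l], eps_levels[l - 1], n_sims[l])
    return est

# Tolerances eps_0 > ... > eps_L, with fewer simulations at the finer
# levels, in the spirit of the usual MLMC cost allocation.
eps_levels = [2.0, 1.0, 0.5, 0.25]
n_sims = [200_000, 100_000, 50_000, 25_000]
print(mlmc_abc_expectation(lambda t: t, eps_levels, n_sims))  # posterior mean estimate

With phi the identity, the printed value is an estimate of the posterior mean under the finest-tolerance ABC approximation, assembled from the coarse-level estimate plus the level differences rather than from i.i.d. sampling at the finest tolerance alone.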
“…The method of Beskos et al (2017) has been extended to the computation of normalising constants (Del Moral et al, 2017) and has been applied to other examples, such as non-local equations (Jasra et al, 2016) and approximate Bayesian computation (Jasra et al, 2019). The method has also been extended to the case that the accuracy of the approximation improves as the dimension of the target grows, that is, infinite-dimensional E (see Beskos et al, 2018).…”
Section: Approaches For Multilevel Monte Carlo Estimation (mentioning)
confidence: 99%