2014
DOI: 10.1007/s11222-014-9525-6
Pre-processing for approximate Bayesian computation in image analysis

Abstract: Most of the existing algorithms for approximate Bayesian computation (ABC) assume that it is feasible to simulate pseudo-data from the model at each iteration. However, the computational cost of these simulations can be prohibitive for high dimensional data. An important example is the Potts model, which is commonly used in image analysis. Images encountered in real world applications can have millions of pixels; therefore, scalability is a major concern. We apply ABC with a synthetic likelihood to the hidden P…

Cited by 31 publications (38 citation statements)
References 31 publications
“…Hence the amount of time required by ABC is dominated by the simulations of y via the Swendsen-Wang algorithm. This motivated Moores, Mengersen, and Robert (2014) to propose cutting the cost of running an ABC experiment by removing the simulation of an image from the hidden Potts model and replacing it with an approximate simulation of the summary statistics. Another alternative is the clever sampler of Mira et al. (2001), which provides exact simulations of Ising models and can be extended to Potts models.…”
Section: Numerical Results
confidence: 99%
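The bottleneck this excerpt describes, that rejection ABC pays for one full lattice simulation per proposed parameter value, can be sketched as follows. This is a minimal toy, not the paper's implementation: it uses single-site Gibbs sampling in place of Swendsen-Wang, a tiny lattice, and a flat prior, and all function names and settings are illustrative assumptions.

```python
import numpy as np

def gibbs_potts2(beta, n, sweeps, rng):
    """Simulate an n x n two-label Potts (Ising) lattice by single-site
    Gibbs sampling. A cheap stand-in for the Swendsen-Wang sampler
    mentioned in the excerpt."""
    z = rng.integers(0, 2, size=(n, n))
    for _ in range(sweeps):
        for i in range(n):
            for j in range(n):
                n0 = n1 = 0  # neighbours with label 0 / label 1
                for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ii, jj = i + di, j + dj
                    if 0 <= ii < n and 0 <= jj < n:
                        if z[ii, jj] == 1:
                            n1 += 1
                        else:
                            n0 += 1
                # conditional probability of label 1 given the neighbours
                p1 = np.exp(beta * n1) / (np.exp(beta * n0) + np.exp(beta * n1))
                z[i, j] = 1 if rng.random() < p1 else 0
    return z

def suff_stat(z):
    # Sufficient statistic of the Potts model: count of agreeing
    # horizontal and vertical neighbour pairs.
    return int((z[1:, :] == z[:-1, :]).sum() + (z[:, 1:] == z[:, :-1]).sum())

def abc_rejection(s_obs, n, n_props, eps, rng):
    """Rejection ABC with a flat prior on (0, 2). The runtime is
    dominated by gibbs_potts2, which must run once per proposal --
    exactly the cost the excerpt describes."""
    accepted = []
    for _ in range(n_props):
        beta = rng.uniform(0.0, 2.0)
        s_sim = suff_stat(gibbs_potts2(beta, n, sweeps=20, rng=rng))
        if abs(s_sim - s_obs) <= eps:
            accepted.append(beta)
    return accepted
```

Even at this toy scale, the simulator accounts for essentially all of the work, which is why the pre-processing idea of the cited paper (approximating the summary statistics directly) pays off.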
“…By replacing S(w) with our surrogate model, we avoid the need to simulate auxiliary variables during model fitting. In Moores et al. [2015], it was shown that this approximation could yield a two-order-of-magnitude improvement in the elapsed runtime for fitting the hidden Potts model, while also improving the convergence properties of the original ABC-SMC algorithm. This is known as nonparametric Bayesian indirect likelihood with summary statistics, or nsBIL.…”
Section: Bayesian Indirect Likelihood
confidence: 99%
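The surrogate idea in this excerpt, precomputing how the summary statistic behaves as a function of the parameter so that no simulation is needed during fitting, can be sketched as follows. The simulator `sim_stat` here is a cheap toy stand-in (a Gaussian whose mean grows with beta), not the Potts model, and all names are illustrative assumptions.

```python
import numpy as np

def sim_stat(beta, rng):
    # Toy stand-in for simulating the summary statistic S at beta;
    # in the cited work this would be an expensive lattice simulation.
    return rng.normal(100.0 * beta, 5.0)

def build_surrogate(grid, n_reps, rng):
    """Offline pre-processing: simulate S once at each grid point and
    record its mean and standard deviation there."""
    means, sds = [], []
    for b in grid:
        draws = np.array([sim_stat(b, rng) for _ in range(n_reps)])
        means.append(draws.mean())
        sds.append(draws.std(ddof=1))
    return np.array(means), np.array(sds)

def surrogate_loglik(beta, s_obs, grid, means, sds):
    """Gaussian log-likelihood with interpolated moments: evaluating it
    needs no simulation, so model fitting becomes cheap."""
    mu = np.interp(beta, grid, means)
    sd = max(np.interp(beta, grid, sds), 1e-9)
    return -0.5 * ((s_obs - mu) / sd) ** 2 - np.log(sd)
```

All simulation cost is paid once in `build_surrogate`; every later likelihood evaluation is just an interpolation, which is the source of the runtime improvement the excerpt reports.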
“…While exact inference is theoretically possible, its applicability is limited to datasets with fewer than a thousand pixels. Comparisons such as McGrory et al. [2009], Moores and Mengersen [2014], and Moores et al. [2015] have shown that auxiliary variable methods such as AEA and ABC are infeasible for the scale of data that is regularly encountered in image analysis.…”
Section: Introduction
confidence: 99%
“…For the inference of the posterior distribution π(θ | y_n), for large n, an MCMC algorithm would be virtually impossible to implement because it would require evaluating the likelihood, and rejection ABC (R-ABC) is time-consuming because simulating a (synthetic) data set from the likelihood (2.5) is computationally costly. However, the synthetic likelihood (SL-ABC) and bootstrap likelihood (BL-ABC) approaches to ABC have proven to be successful in providing inference from the posterior distribution π(θ | y_n) of the Bayesian Potts model with relatively low computational cost [see 188, 285]. The general SL-ABC and BL-ABC methods are described in Section 4.…”
Section: Hidden Potts Model
confidence: 99%
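The SL-ABC approach mentioned here can be sketched in the style of Wood's synthetic likelihood: at each proposed parameter, a small batch of summary statistics is simulated, a Gaussian is fitted to them, and the observed statistic is scored under that Gaussian inside a Metropolis-Hastings sampler. The simulator below is a cheap toy stand-in, and every name and setting is an illustrative assumption, not the cited method's actual configuration.

```python
import numpy as np

def synthetic_loglik(theta, s_obs, simulate, n_reps, rng):
    """Synthetic likelihood: simulate n_reps summary statistics at theta,
    fit a Gaussian to them, and evaluate the observed statistic."""
    sims = np.array([simulate(theta, rng) for _ in range(n_reps)])
    mu, sd = sims.mean(), max(sims.std(ddof=1), 1e-9)
    return -0.5 * ((s_obs - mu) / sd) ** 2 - np.log(sd)

def sl_mh(s_obs, simulate, theta0, n_iter, step, rng):
    """Random-walk Metropolis-Hastings targeting the SL-ABC posterior
    under a flat prior on (0, inf)."""
    theta = theta0
    ll = synthetic_loglik(theta, s_obs, simulate, 30, rng)
    chain = []
    for _ in range(n_iter):
        prop = theta + step * rng.normal()
        if prop > 0.0:  # reject proposals outside the prior support
            ll_prop = synthetic_loglik(prop, s_obs, simulate, 30, rng)
            if np.log(rng.random()) < ll_prop - ll:
                theta, ll = prop, ll_prop
        chain.append(theta)
    return np.array(chain)
```

Each MH iteration still costs a batch of simulations, which is why SL-ABC pays off only when the summary-statistic simulator is much cheaper than simulating a full data set.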
“…ABC methods have been developed to estimate various Bayesian models with intractable likelihood, including alpha-stable [199], bivariate beta distribution [57], coalescent [e.g., 253, 87], copula [106, 107, 108], differential equation [e.g., 258], ecological [e.g., 283, 91], epidemic [e.g., 180, 181, 182, 150], extreme value [81, 80], financial [e.g., 199], hidden Markov [e.g., 130], hydrological [e.g., 193], image analysis [e.g., 193, 188], network analysis [e.g., 273, 92], order-restricted [140], population evolution [e.g., 173], quantile distribution [e.g., 6, 73, 211], spatial process [239], species abundance distribution [e.g., 124], state-space [16, 265], stationary process [8], statistical relational learning [63], susceptible-infected-removed (SIR) [258], and time-series models [e.g., 129, 128]. ABC methods have also been developed for optimal Bayesian designs [76, 112], reinforcement learning [70], and the estimation of intractable integrated likelihoods [105] and for approximate maximum likelihood estimation [e.g., 235,
Section: Introduction
confidence: 99%