2016
DOI: 10.48550/arxiv.1603.08232
Preprint

The block-Poisson estimator for optimally tuned exact subsampling MCMC

Abstract: Speeding up Markov Chain Monte Carlo (MCMC) for datasets with many observations by data subsampling has recently received considerable attention in the literature. The currently available methods are either approximate, highly inefficient, or limited to small-dimensional models. We propose a pseudo-marginal MCMC method that estimates the likelihood by data subsampling using a block-Poisson estimator. The estimator is a product of Poisson estimators, each based on an independent subset of the observations. The co…
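To make the abstract's construction concrete, the following is a minimal sketch of an estimator with the block-Poisson structure it describes: a product of λ Poisson estimators, each combining a Poisson(1) number of independent unbiased subsample estimates ψ̂ of the log-likelihood, shifted by a lower bound a. The names `psi_hat`, `a`, and `lam` are illustrative stand-ins; the paper's tuning of a and λ and its correlated subsample updates are not shown, so this is a sketch of the idea, not the paper's implementation.

```python
import numpy as np

def block_poisson_estimate(psi_hat, a, lam, rng):
    """Sketch of a block-Poisson-type unbiased estimator of exp(psi).

    psi_hat : callable returning an independent unbiased subsample
              estimate of psi (e.g. the log-likelihood, possibly after
              subtracting a control variate) -- hypothetical placeholder.
    a       : lower bound / centering constant (a tuning parameter).
    lam     : number of blocks (lambda in the paper's notation).

    Returns (sign, log_abs) with estimate = sign * exp(log_abs), since
    the product can be negative when psi_hat() - a changes sign.
    """
    log_abs, sign = a + lam, 1.0
    for _ in range(lam):                      # one Poisson estimator per block
        for _ in range(rng.poisson(1.0)):     # Poisson(1) factors in this block
            factor = (psi_hat() - a) / lam    # fresh independent subsample draw
            sign *= np.sign(factor)
            log_abs += np.log(abs(factor))
    return sign, log_abs

# Toy usage: the lambda below stands in for a real subsample estimator.
rng = np.random.default_rng(1)
sign, log_abs = block_poisson_estimate(lambda: rng.normal(10.0, 0.5),
                                       a=5.0, lam=5, rng=rng)
```

Because ψ̂ − a can be negative, the estimate carries a sign; the usual remedy in this literature is to run the pseudo-marginal chain on the absolute value and correct for the sign afterwards.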

Cited by 12 publications (33 citation statements)
References 35 publications
“…Typically, a small fraction of the observations is included, hence speeding up the execution time significantly compared to MH. However, the augmentation scheme severely affects the mixing of the Firefly algorithm, and it has been demonstrated to perform poorly compared to MH and other subsampling approaches (Bardenet et al., 2015; Quiroz et al., 2016, 2017). We conclude that, of the methods discussed, delayed acceptance seems to be the only feasible route to obtain exact simulation via data subsampling.…”
Section: Introduction (mentioning)
Confidence: 94%
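For context on the mechanism this excerpt singles out, here is a minimal sketch of two-stage (delayed acceptance) Metropolis-Hastings, assuming a symmetric proposal. The names `log_surrogate`, `log_target`, and `propose` are hypothetical placeholders, with the surrogate standing in for a cheap subsample-based approximation; this illustrates the general idea only, not any cited implementation.

```python
import numpy as np

def delayed_acceptance_mh(theta0, log_surrogate, log_target, propose, n_iter, rng):
    """Two-stage (delayed acceptance) MH with a symmetric proposal.

    Stage 1 screens proposals with a cheap surrogate log-density, e.g.
    one built from a data subsample; stage 2 evaluates the full log-target
    only for survivors and corrects the surrogate error exactly, so the
    chain still targets the exact posterior.
    """
    theta = theta0
    lsur, ltar = log_surrogate(theta), log_target(theta)
    chain = []
    for _ in range(n_iter):
        prop = propose(theta, rng)
        lsur_p = log_surrogate(prop)
        # Stage 1: cheap screen, acceptance prob min(1, surrogate ratio).
        if np.log(rng.uniform()) < lsur_p - lsur:
            ltar_p = log_target(prop)        # expensive full-data evaluation
            # Stage 2: correct exactly for the surrogate's error.
            if np.log(rng.uniform()) < (ltar_p - ltar) - (lsur_p - lsur):
                theta, lsur, ltar = prop, lsur_p, ltar_p
        chain.append(theta)
    return chain
```

Because stage 2 corrects exactly for the surrogate error, the chain remains exact; the saving comes from rejecting most poor proposals before paying for the full-data evaluation.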
“…Other authors use a subsample of the data in each MCMC iteration to speed up the algorithm; see, e.g., Korattikara et al. (2014), Bardenet et al. (2014), Maclaurin and Adams (2014), Maire et al. (2015), Bardenet et al. (2015), and Quiroz et al. (2016, 2017). Finally, delayed acceptance MCMC has been used to speed up computations (Banterle et al., 2014; Payne and Mallick, 2015).…”
Section: Introduction (mentioning)
Confidence: 99%
“…To reduce the computational complexity of HMC and improve its scalability to large data sets, Welling and Teh (2011) suggested using stochastic estimates of the gradient of the likelihood. Many recent articles describe the possibility of such subsampling combined with MCMC (Quiroz et al., 2016, 2017, 2019; Flegal, 2012; Pillai and Smith, 2014), where unbiased likelihood estimates are obtained from subsamples of the whole data set in such a way that ergodicity and the desired limiting properties of the MCMC algorithm are maintained. These methods are not part of the current implementation of BGNLM, but our approach can relatively easily be adapted to allow subsampling MCMC techniques.…”
Section: Summary and Discussion (mentioning)
Confidence: 99%
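The Welling and Teh (2011) idea mentioned above is stochastic gradient Langevin dynamics (SGLD). Below is a minimal sketch of one SGLD update; `grad_log_prior`, `grad_log_lik`, and `batch` are illustrative stand-ins. Unlike the unbiased-likelihood subsampling methods discussed in the surrounding excerpts, this update applies no MH correction and is exact only in the limit of a vanishing step size.

```python
import numpy as np

def sgld_step(theta, X, step, grad_log_prior, grad_log_lik, batch, rng):
    """One stochastic gradient Langevin dynamics update (after Welling & Teh, 2011).

    The full-data gradient is replaced by a rescaled minibatch estimate,
    and Gaussian noise with variance `step` is injected; no MH correction
    is applied, so draws are exact only as step -> 0.
    """
    N = len(X)
    idx = rng.choice(N, size=batch, replace=False)      # random minibatch
    grad = grad_log_prior(theta) + (N / batch) * sum(
        grad_log_lik(theta, X[i]) for i in idx)         # unbiased gradient estimate
    noise = rng.normal(scale=np.sqrt(step), size=np.shape(theta))
    return theta + 0.5 * step * grad + noise
```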
“…These minibatch MH methods can be divided into two classes, exact and inexact, depending on whether or not the target distribution π is necessarily preserved. Inexact methods introduce asymptotic bias to the target distribution, trading off correctness for speedups [6, 16, 20, 21, 23]. Exact methods either require impractically strong constraints on the target distribution [18, 24], limiting their applicability in practice, or they negatively impact efficiency, counteracting the speedups that minibatching aims to provide in the first place [4, 10].…”
Section: Introduction (mentioning)
Confidence: 99%
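To make the exact/inexact distinction concrete, here is a sketch of the naive inexact variant: the full-data log-likelihood ratio in the MH test is replaced by a rescaled minibatch estimate. The names `log_lik`, `log_prior`, and `batch` are hypothetical, and a symmetric proposal is assumed. Because the noisy ratio enters a nonlinear accept/reject rule, the invariant distribution is perturbed, which is the asymptotic bias the excerpt describes.

```python
import numpy as np

def naive_minibatch_mh_step(theta, prop, X, log_lik, log_prior, batch, rng):
    """One step of a naive *inexact* minibatch MH test (symmetric proposal).

    The full-data log-likelihood ratio is replaced by a rescaled minibatch
    estimate; the resulting chain targets a perturbed distribution rather
    than the exact posterior.
    """
    N = len(X)
    idx = rng.choice(N, size=batch, replace=False)      # random minibatch
    llr = (N / batch) * sum(
        log_lik(prop, X[i]) - log_lik(theta, X[i]) for i in idx)
    log_alpha = llr + log_prior(prop) - log_prior(theta)
    return prop if np.log(rng.uniform()) < log_alpha else theta
```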