We propose a novel blocked version of the continuous-time bouncy particle sampler of Bouchard-Côté et al. (J Am Stat Assoc 113(522):855–867, 2018) which is applicable to any differentiable probability density. This alternative implementation is motivated by blocked Gibbs sampling for state-space models (Singh et al. in Biometrika 104(4):953–969, 2017) and leads to a significant improvement in effective sample size per second and, furthermore, allows for substantial parallelization of the resulting algorithm. The new algorithms are particularly efficient for latent state inference in high-dimensional state-space models, where blocking in both space and time is necessary to avoid degeneracy of MCMC. The efficiency of our blocked bouncy particle sampler, in comparison with both the standard implementation of the bouncy particle sampler and the particle Gibbs algorithm of Andrieu et al. (J R Stat Soc Ser B Stat Methodol 72(3):269–342, 2010), is illustrated numerically for both simulated data and a challenging real-world financial dataset.
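To make the underlying dynamics concrete, here is a minimal sketch of the standard (unblocked) bouncy particle sampler for a standard Gaussian target, where the event rate along a straight-line segment is linear in time and the first bounce time can be inverted in closed form. The function name `bps_gaussian` and all parameter choices are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def bps_gaussian(T=2000.0, dt=0.5, lam_ref=1.0, d=2, seed=0):
    """Bouncy particle sampler for a standard Gaussian target,
    U(x) = |x|^2 / 2, so grad U(x) = x.

    Along a segment x + v*t, the bounce rate max(0, v.grad U) equals
    max(0, a + b*t) with a = v.x and b = |v|^2, so the first event time
    can be sampled exactly by inverting the integrated rate (no thinning).
    """
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(d)
    v = rng.standard_normal(d)
    t_clock, t_next = 0.0, dt
    samples = []
    while t_clock < T:
        a, b = v @ x, v @ v                      # rate(t) = max(0, a + b t)
        E = rng.exponential()                    # Exp(1) for time inversion
        t_bounce = (-a + np.sqrt(max(a, 0.0) ** 2 + 2.0 * b * E)) / b
        t_refresh = rng.exponential(1.0 / lam_ref)
        tau = min(t_bounce, t_refresh)
        # record positions on a regular time grid along the linear segment
        while t_next <= t_clock + tau:
            samples.append(x + v * (t_next - t_clock))
            t_next += dt
        x = x + v * tau
        if t_bounce < t_refresh:
            g = x                                # grad U at the event point
            v = v - 2.0 * (v @ g) / (g @ g) * g  # elastic reflection
        else:
            v = rng.standard_normal(d)           # full velocity refreshment
        t_clock += tau
    return np.array(samples)
```

The blocked variant proposed in the paper partitions the coordinates and applies such dynamics blockwise; the sketch above only shows the global sampler the paper builds on.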
The use of non-differentiable priors in Bayesian statistics has become increasingly popular, in particular in Bayesian imaging analysis. Current state-of-the-art methods are approximate in the sense that they replace the posterior with a smooth approximation via Moreau-Yosida envelopes, and apply gradient-based discretized diffusions to sample from the resulting distribution. We characterize the error of the Moreau-Yosida approximation and propose a novel implementation using underdamped Langevin dynamics. In mission-critical cases, however, replacing the posterior with an approximation may not be a viable option. Instead, we show that piecewise-deterministic Markov processes (PDMPs) can be utilized for exact posterior inference from distributions satisfying almost everywhere differentiability.
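The Moreau-Yosida smoothing step described above can be illustrated for the simplest non-differentiable potential, f(x) = |x|, whose envelope is the Huber function and whose proximal operator is soft-thresholding; the envelope's gradient, (x − prox(x))/λ, then drives an unadjusted Langevin chain. This is a hedged sketch of the general smoothing-plus-diffusion idea, not the paper's specific algorithm; the names `prox_abs`, `moreau_env_abs`, and `ula_smoothed` and all parameter values are assumptions for illustration.

```python
import numpy as np

def prox_abs(x, lam):
    """Proximal operator of f(x) = |x|: soft-thresholding."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def moreau_env_abs(x, lam):
    """Moreau-Yosida envelope of |x|: min_y |y| + (x - y)^2 / (2 lam).
    This equals the Huber function and is differentiable everywhere."""
    p = prox_abs(x, lam)
    return np.abs(p) + (x - p) ** 2 / (2.0 * lam)

def ula_smoothed(y=0.0, lam=0.5, gamma=0.05, n=20000, seed=0):
    """Unadjusted Langevin chain targeting the smoothed posterior
    proportional to exp(-(x - y)^2 / 2 - e_lam(x)), using the envelope
    gradient grad e_lam(x) = (x - prox_abs(x, lam)) / lam."""
    rng = np.random.default_rng(seed)
    x, out = 0.0, []
    for _ in range(n):
        grad = (x - y) + (x - prox_abs(x, lam)) / lam
        x = x - gamma * grad + np.sqrt(2.0 * gamma) * rng.standard_normal()
        out.append(x)
    return np.array(out)
```

The PDMP route mentioned in the abstract instead samples the non-smoothed posterior exactly, avoiding this smoothing bias at the cost of simulating event-driven dynamics.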