We propose Subsampling MCMC, a Markov chain Monte Carlo (MCMC) framework in which the likelihood function for n observations is estimated from a random subset of m observations. We introduce a highly efficient unbiased estimator of the log-likelihood based on control variates, whose computing cost is much smaller than that of the full log-likelihood in standard MCMC. The likelihood estimate is bias-corrected and used in two dependent pseudo-marginal algorithms to sample from a perturbed posterior, for which we derive the asymptotic error with respect to n and m, respectively. We propose a practical estimator of this error and show that it is negligible even for very small m in our applications. We demonstrate that Subsampling MCMC is substantially more efficient than standard MCMC in terms of sampling efficiency for a given computational budget, and that it outperforms other subsampling methods for MCMC proposed in the literature.
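As a minimal illustration of the control-variate idea (a toy sketch, not the paper's estimator), the snippet below estimates a full-data Gaussian log-likelihood from m subsampled terms using a second-order Taylor expansion around a reference parameter as the control variate. For this Gaussian model the quadratic expansion is exact, so the subsampled estimate recovers the full sum; in general only unbiasedness holds.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: n observations from N(1, 1); interest is in the mean mu.
n, m = 100_000, 100
y = rng.normal(1.0, 1.0, size=n)

def loglik_i(mu, idx):
    # Per-observation Gaussian log-likelihood terms.
    return -0.5 * np.log(2 * np.pi) - 0.5 * (y[idx] - mu) ** 2

# Control variates: second-order Taylor expansion of each log-likelihood
# term around a fixed reference value mu_ref (e.g. a preliminary estimate).
mu_ref = y.mean()
q_sum_const = loglik_i(mu_ref, np.arange(n)).sum()
g_sum = (y - mu_ref).sum()  # sum of first derivatives at mu_ref
h_sum = -float(n)           # sum of second derivatives at mu_ref

def loglik_estimate(mu):
    """Unbiased estimate of the full-data log-likelihood from m terms."""
    d = mu - mu_ref
    q_sum = q_sum_const + g_sum * d + 0.5 * h_sum * d ** 2
    idx = rng.integers(0, n, size=m)  # subsample with replacement
    q_i = loglik_i(mu_ref, idx) + (y[idx] - mu_ref) * d - 0.5 * d ** 2
    diff = loglik_i(mu, idx) - q_i
    return q_sum + n * diff.mean()    # difference estimator

full = loglik_i(0.9, np.arange(n)).sum()  # O(n) reference computation
est = loglik_estimate(0.9)                # O(m) per evaluation
```

The cost per MCMC iteration is O(m) once the sums over the control variates are precomputed, which is the source of the speed-up described above.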
We propose the Bayesian adaptive Lasso (BaLasso) for variable selection and coefficient estimation in linear regression. The BaLasso adapts to the signal level by applying a different amount of shrinkage to each coefficient. Furthermore, we provide a model selection machinery for the BaLasso by assessing the posterior conditional mode estimates, motivated by the hierarchical Bayesian interpretation of the Lasso. Our formulation also permits prediction using a model averaging strategy. We discuss other variants of this new approach and provide a unified framework for variable selection using flexible penalties. The attractiveness of the method is demonstrated empirically through extensive simulation studies and data analysis.
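To make coefficient-specific shrinkage concrete, here is a hedged sketch (a plain coordinate-descent Lasso, not the BaLasso sampler) with a separate penalty per coefficient — the quantity the BaLasso places a prior over rather than fixing in advance:

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of the absolute-value penalty.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def adaptive_lasso_cd(X, y, lam, n_iter=200):
    """Coordinate descent for
        min_b 0.5 * ||y - X b||^2 + n * sum_j lam[j] * |b_j|,
    with a separate penalty lam[j] for each coefficient."""
    n, p = X.shape
    beta = np.zeros(p)
    col_ss = (X ** 2).sum(axis=0)
    r = y - X @ beta
    for _ in range(n_iter):
        for j in range(p):
            r += X[:, j] * beta[j]  # remove j's contribution from residual
            beta[j] = soft_threshold(X[:, j] @ r, n * lam[j]) / col_ss[j]
            r -= X[:, j] * beta[j]  # add the updated contribution back

    return beta

# Demo: a strong signal with a light penalty, two null coefficients with
# heavy penalties that shrink them exactly to zero.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))
y = X @ np.array([2.0, 0.0, 0.0]) + 0.1 * rng.normal(size=200)
beta = adaptive_lasso_cd(X, y, lam=np.array([0.01, 0.5, 0.5]))
```

Unequal penalties let large signals escape heavy shrinkage while noise coefficients are pushed to exactly zero, which is the adaptivity the abstract refers to.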
Variational Bayes (VB) is rapidly becoming a popular tool for Bayesian inference in statistical modeling. However, existing VB algorithms are restricted to cases where the likelihood is tractable, which precludes their use in many interesting situations, such as state space models and approximate Bayesian computation (ABC). This paper extends the scope of VB to cases where the likelihood is intractable but can be estimated unbiasedly. The proposed VB method therefore makes it possible to carry out Bayesian inference in many statistical applications, including state space models and ABC. The method is generic in the sense that it can be applied to almost any statistical model without requiring much model-specific derivation, a drawback of many existing VB algorithms. We also show how the proposed method can be used to obtain highly accurate VB approximations of marginal posterior distributions.
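The sketch below shows the outer stochastic-gradient VB loop on a toy conjugate model, fitting a Gaussian variational approximation by reparameterization-gradient ascent on the ELBO. It uses an exact likelihood, so it is only a schematic of the optimization; the method described above would replace the likelihood term with an unbiased estimate when the likelihood itself is intractable.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy conjugate model: y_i ~ N(theta, 1), prior theta ~ N(0, 10^2).
n = 50
y = rng.normal(1.0, 1.0, size=n)
prior_var = 100.0

# Exact posterior, available here for checking the VB fit.
post_prec = n + 1.0 / prior_var
post_mean = y.sum() / post_prec
post_sd = post_prec ** -0.5

def dlogjoint(theta):
    # Gradient of log p(y | theta) + log p(theta) with respect to theta.
    return (y.sum() - n * theta) - theta / prior_var

# Variational family q = N(m, exp(ls)^2); optimize (m, ls) by stochastic
# gradient ascent on the ELBO via the reparameterization
#   theta = m + exp(ls) * eps,  eps ~ N(0, 1).
m, ls, lr = 0.0, 0.0, 0.005
for _ in range(3000):
    eps = rng.normal(size=200)
    theta = m + np.exp(ls) * eps
    g = dlogjoint(theta)
    grad_m = g.mean()
    grad_ls = (g * np.exp(ls) * eps).mean() + 1.0  # +1 from q's entropy
    m += lr * grad_m
    ls += lr * grad_ls
```

Because the target posterior is Gaussian here, the fitted `(m, exp(ls))` converges to the exact posterior mean and standard deviation, which makes the loop easy to verify.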
The Linear Ballistic Accumulator (LBA) model of Brown and Heathcote (2008) is used as a measurement tool to answer questions in applied psychology. These analyses involve parameter estimation and model selection, and modern approaches use hierarchical Bayesian methods and Markov chain Monte Carlo (MCMC) to estimate the posterior distribution of the parameters. Although a range of approaches is used for model selection, they are all based on the posterior samples produced via MCMC, so the model selection inferences inherit the properties of the MCMC sampler. We address these constraints by proposing two new approaches to Bayesian estimation of the hierarchical LBA model, both qualitatively different from all existing approaches and based on recent advances in particle-based Monte Carlo methods. The first is a particle MCMC scheme using Metropolis-within-Gibbs steps; the second uses a version of annealed importance sampling. Both methods offer greatly improved sampling efficiency and parallelisability for high-performance computing. A further advantage of the annealed importance sampling algorithm is that an estimate of the marginal likelihood is obtained as a byproduct of sampling, making it straightforward to apply model selection via Bayes factors. The new approaches provide opportunities to apply the LBA model with greater confidence than before, and to extend its use to previously intractable cases. We illustrate the proposed methods with pseudo-code, and by application to simulated and real datasets.
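To illustrate how annealed importance sampling yields a marginal-likelihood estimate as a byproduct, here is a generic AIS sketch (not the paper's algorithm, and on a conjugate Gaussian rather than the LBA model) where the true marginal likelihood is available in closed form for comparison:

```python
import numpy as np

rng = np.random.default_rng(0)

# One observation y ~ N(theta, 1) with prior theta ~ N(0, 1),
# so the exact marginal likelihood is N(y; 0, 2).
y_obs = 1.0

def log_prior(th):
    return -0.5 * th ** 2 - 0.5 * np.log(2 * np.pi)

def log_lik(th):
    return -0.5 * (y_obs - th) ** 2 - 0.5 * np.log(2 * np.pi)

# Geometric annealing path: p(th) * L(th)^beta, beta from 0 (prior) to 1.
betas = np.linspace(0.0, 1.0, 51)
n_particles = 2000

th = rng.normal(size=n_particles)  # exact draws from the prior (beta = 0)
logw = np.zeros(n_particles)
for b_prev, b in zip(betas[:-1], betas[1:]):
    logw += (b - b_prev) * log_lik(th)  # incremental importance weight
    # One random-walk Metropolis step targeting p(th) * L(th)^b.
    prop = th + 0.5 * rng.normal(size=n_particles)
    log_acc = (log_prior(prop) + b * log_lik(prop)) \
            - (log_prior(th) + b * log_lik(th))
    accept = np.log(rng.uniform(size=n_particles)) < log_acc
    th = np.where(accept, prop, th)

# The average of the AIS weights is an unbiased estimate of the
# marginal likelihood; work in log space for stability.
log_Z = np.logaddexp.reduce(logw) - np.log(n_particles)
exact = -0.5 * np.log(2 * np.pi * 2.0) - y_obs ** 2 / 4.0  # log N(y; 0, 2)
```

The same weight-averaging step is what makes Bayes-factor model selection a free byproduct of an annealed sampler: run AIS once per model and compare the resulting marginal-likelihood estimates.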