2021
DOI: 10.1080/01621459.2020.1847120

Stochastic Gradient Markov Chain Monte Carlo

Abstract: Markov chain Monte Carlo (MCMC) algorithms are generally regarded as the gold standard technique for Bayesian inference. They are theoretically well-understood and conceptually simple to apply in practice. The drawback of MCMC is that performing exact inference generally requires all of the data to be processed at each iteration of the algorithm. For large datasets, the computational cost of MCMC can be prohibitive, which has led to recent developments in scalable Monte Carlo algorithms that have a significant…

Cited by 90 publications (68 citation statements)
References 51 publications
“…Zhou and Tartakovsky [13] proposed an adaptive MCMC method in which the transmission model is replaced by a fast, accurate surrogate model built on a deep convolutional neural network to reduce the computational burden. To address the main shortcoming of MCMC, namely its huge computational cost when the dataset is large, Nemeth and Fearnhead [14] reviewed recent developments in scalable Monte Carlo algorithms that reduce the cost of MCMC. In this paper, we focus on stochastic gradient MCMC (SGMCMC), which uses data subsampling to reduce the cost of each MCMC iteration while achieving good results.…”
Section: Introduction (mentioning)
confidence: 99%
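
To make the subsampling idea concrete, here is a minimal sketch of stochastic gradient Langevin dynamics (SGLD), the prototypical SGMCMC algorithm: each update uses a gradient of the log-posterior estimated from a random minibatch, plus injected Gaussian noise. The Gaussian toy model, prior, step size, and batch size below are illustrative assumptions, not details taken from the paper or the citing work.

```python
# Minimal SGLD sketch on an assumed toy model: N(theta, 1) likelihood,
# N(0, 10) prior. All constants here are illustrative, not from the paper.
import numpy as np

rng = np.random.default_rng(0)
N = 10_000
data = rng.normal(loc=2.0, scale=1.0, size=N)   # synthetic observations

def grad_log_post(theta, batch):
    """Unbiased minibatch estimate of the log-posterior gradient."""
    grad_prior = -theta / 10.0                            # from N(0, 10) prior
    grad_lik = (N / len(batch)) * np.sum(batch - theta)   # rescaled subsample
    return grad_prior + grad_lik

theta, eps, batch_size, n_iters = 0.0, 1e-5, 100, 5_000
samples = np.empty(n_iters)
for t in range(n_iters):
    batch = rng.choice(data, size=batch_size, replace=False)
    noise = rng.normal(scale=np.sqrt(eps))                # Langevin noise
    theta += 0.5 * eps * grad_log_post(theta, batch) + noise
    samples[t] = theta

print("posterior mean estimate:", samples[n_iters // 2:].mean())
```

Because the minibatch gradient is rescaled by N / batch_size, it is an unbiased estimate of the full-data gradient, which is what lets each SGLD iteration avoid touching all N observations.
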
“…Since we have introduced randomness through the reparameterization trick in (3.9), we instead exploit stochastic gradients to maximize the ELBO L. Stochastic gradients [14,18,145,151,163] are unbiased estimators of the exact gradients [5,15,129] and can be computed efficiently when the exact gradients cannot be calculated directly. In our case, we first draw a sample e from φ(e).…”
Section: Variational Bayesian Inference for SLAM (mentioning)
confidence: 99%
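
As a hedged illustration of the quoted construction, the sketch below estimates gradients of an expectation via the reparameterization trick: draw e ~ φ(e) = N(0, 1) and set z = μ + σe, so the gradient passes through the sample. The Gaussian variational family and the toy integrand f(z) = −z², standing in for the ELBO integrand, are assumptions for illustration, not the cited thesis's model.

```python
# Reparameterization-trick gradient estimate under an assumed Gaussian
# variational family q(z) = N(mu, sigma^2) and a toy integrand f(z) = -z^2.
import numpy as np

rng = np.random.default_rng(1)

def grad_estimates(mu, sigma, n_samples=1_000):
    e = rng.normal(size=n_samples)     # draw e ~ phi(e) = N(0, 1)
    z = mu + sigma * e                 # reparameterize: z = mu + sigma * e
    df_dz = -2.0 * z                   # d f / d z for the toy integrand
    grad_mu = np.mean(df_dz)           # chain rule: dz/dmu = 1
    grad_sigma = np.mean(df_dz * e)    # chain rule: dz/dsigma = e
    return grad_mu, grad_sigma

print(grad_estimates(mu=1.0, sigma=0.5))
# E[f(z)] = -(mu^2 + sigma^2), so the exact gradients are (-2*mu, -2*sigma);
# the Monte Carlo estimates above are unbiased for these quantities.
```
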
“…The gradients in (4.15) are referred to as stochastic gradients; they are unbiased estimates of the exact quantities in (4.13) [5,15,129]. With these stochastic gradients, we optimize the ELBO L in (4.12) iteratively.…”
Section: Stochastic Variational Bayesian Inference for the Full SLAM Problem (mentioning)
confidence: 99%
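
To show what optimizing the ELBO iteratively with such gradients might look like, here is a self-contained sketch of stochastic-gradient ascent with a decaying (Robbins-Monro-style) step size, reusing the same illustrative Gaussian family and toy integrand as above; the schedule constants are assumptions, not values from the cited work.

```python
# Iterative stochastic-gradient ascent on an assumed toy objective
# E[f(z)] with f(z) = -z^2, z = mu + sigma * e, e ~ N(0, 1).
import numpy as np

rng = np.random.default_rng(2)
mu, sigma = 0.0, 1.0
for t in range(1, 2_001):
    rho = 0.5 / (t + 10.0)                  # decaying step sizes
    e = rng.normal(size=50)                 # fresh noise each iteration
    z = mu + sigma * e
    df_dz = -2.0 * z                        # gradient of the toy integrand
    mu += rho * np.mean(df_dz)              # ascent step for mu
    sigma = max(sigma + rho * np.mean(df_dz * e), 1e-3)  # keep scale positive
print("optimized (mu, sigma):", (round(mu, 4), round(sigma, 4)))
```

Each iteration uses a fresh noise sample, so the updates follow unbiased but noisy gradients; the decaying step size averages out this noise over iterations.
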