2018
DOI: 10.1080/01621459.2018.1473776

Frequentist Consistency of Variational Bayes

Abstract: A key challenge for modern Bayesian statistics is how to perform scalable inference of posterior distributions. To address this challenge, variational Bayes (VB) methods have emerged as a popular alternative to the classical Markov chain Monte Carlo (MCMC) methods. VB methods tend to be faster while achieving comparable predictive performance. However, there are few theoretical results around VB. In this paper, we establish frequentist consistency and asymptotic normality of VB methods. Specifically, we connec…



Cited by 143 publications (128 citation statements). References 78 publications.
“…Because the ELBO is convex with respect to each of the variational factors, the coordinate ascent updates are guaranteed to converge to a local optimum of the ELBO (Boyd and Vandenberghe, 2004). Moreover, an important result from the frequentist perspective is the variational Bernstein–von Mises theorem, which states that under benign conditions, the mean-field variational Bayes estimate θ̂ = ∫ θ q*(θ) dθ is consistent (Wang and Blei, 2018).…”
Section: Figure 1: Schematic Representations of Markov Chain Monte Carlo… (mentioning)
confidence: 99%
“…This proposition builds on [19, Theorem 5], which shows that, modulo Assumptions 3.1, 3.3, and 3.4, the NVB approximate posterior distribution is asymptotically consistent (see Corollary 3.1). Using the consistency of q*(θ | X^n) and Assumption 3.2(1), we first establish the pointwise convergence of H_{q*}(a) to H_0(a).…”
Section: Analysis of the NVB Decision Rule (mentioning)
confidence: 79%
“…The primary result in this section shows that the loss-calibrated approximate posterior q*_a(θ | X^n) for any a ∈ A is consistent and converges to the Dirac delta distribution at θ_0. We establish the frequentist consistency of the LCVB approximate posterior, extending and building on the results in [19].…”
Section: Consistency of the LCVB Approximate Posterior (mentioning)
confidence: 85%
“…This method will converge to a local maximum of the evidence lower bound (Lange, 2013). Wang and Blei (2017) show that under certain conditions, Bayesian point estimators extracted from the variational Bayes solution are consistent.…”
Section: Mean-Field Variational Approximation (mentioning)
confidence: 99%
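To make the quoted convergence and consistency claims concrete, here is a minimal runnable sketch of mean-field coordinate ascent (CAVI) on a toy conjugate model. Everything in it is an illustrative assumption, not the cited papers' implementation: the model (a normal with unknown mean and precision under a normal-gamma prior), the hyperparameter values, and all variable names. Each coordinate update maximizes the ELBO in one factor while holding the other fixed, so the ELBO increases monotonically to a local optimum, and the point estimates extracted from the variational solution approach the truth as n grows.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: n i.i.d. draws from N(mu_true, 1 / tau_true).
mu_true, tau_true, n = 2.0, 4.0, 5000
x = rng.normal(mu_true, 1.0 / np.sqrt(tau_true), size=n)

# Conjugate normal-gamma prior (hyperparameters are illustrative):
#   mu | tau ~ N(mu0, 1 / (lam0 * tau)),   tau ~ Gamma(a0, rate=b0).
mu0, lam0, a0, b0 = 0.0, 1.0, 1.0, 1.0
xbar, sum_x, sum_x2 = x.mean(), x.sum(), np.sum(x**2)

# Mean-field family q(mu, tau) = q(mu) q(tau), with
#   q(mu) = N(mu_n, 1 / lam_n)  and  q(tau) = Gamma(a_n, rate=b_n).
mu_n = (lam0 * mu0 + n * xbar) / (lam0 + n)  # closed form; never changes
a_n = a0 + (n + 1) / 2.0                     # closed form; never changes
lam_n, b_n = lam0, b0                        # initial guesses

# CAVI: each update maximizes the ELBO in one factor given the other,
# so the ELBO increases monotonically to a local optimum.
for _ in range(100):
    e_tau = a_n / b_n                                  # E_q[tau]
    lam_n = (lam0 + n) * e_tau                         # update q(mu)
    e_mu, e_mu2 = mu_n, mu_n**2 + 1.0 / lam_n          # E_q[mu], E_q[mu^2]
    b_n = b0 + 0.5 * (lam0 * (e_mu2 - 2 * mu0 * e_mu + mu0**2)
                      + sum_x2 - 2 * e_mu * sum_x + n * e_mu2)  # update q(tau)

# VB point estimates extracted from the variational solution; per the
# variational Bernstein--von Mises theorem these are consistent in n.
print(f"E_q[mu]  = {mu_n:.3f}   (truth {mu_true})")
print(f"E_q[tau] = {a_n / b_n:.3f}   (truth {tau_true})")
```

Running this with growing n shrinks the gap between the VB posterior means and (mu_true, tau_true), which is exactly the consistency behavior the quoted statements attribute to Wang and Blei.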