2020
DOI: 10.1002/sta4.258

Asymptotic consistency of loss‐calibrated variational Bayes

Abstract: This paper establishes the asymptotic consistency of the loss-calibrated variational Bayes (LCVB) method. LCVB was proposed in [12] as a method for approximately computing Bayesian posteriors in a 'loss-aware' manner, and it is also highly relevant in general data-driven decision-making contexts. Here, we establish not only the asymptotic consistency of the calibrated approximate posterior but also the asymptotic consistency of decision rules. We also establish the asymptotic consistency of decision…
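
As a quick, hedged orientation to the object under study, the following is a minimal sketch of the loss-calibrated variational bound in the spirit of the method proposed in [12]; the notation (utility u, action a, data D, variational family Q) is assumed for illustration, not quoted from the paper.

% A minimal sketch of the loss-calibrated variational bound (LCVB),
% in the spirit of [12]; notation is ours, not the paper's.
\documentclass{article}
\usepackage{amsmath}
\begin{document}
For a positive utility $u(a,\theta)$ over actions $a$ and parameters
$\theta$, Jensen's inequality gives, for any $q \in \mathcal{Q}$,
\begin{align*}
\log \int u(a,\theta)\, p(\theta \mid D)\, \mathrm{d}\theta
  &\ge \mathbb{E}_{q}\!\left[ \log \frac{u(a,\theta)\, p(\theta \mid D)}{q(\theta)} \right] \\
  &= \mathbb{E}_{q}\!\left[ \log u(a,\theta) \right]
     - \mathrm{KL}\!\left( q \,\middle\|\, p(\cdot \mid D) \right).
\end{align*}
LCVB maximizes this bound jointly over $q \in \mathcal{Q}$ and the
action $a$, so the approximation is calibrated to the decision
problem rather than fit purely for posterior accuracy.
\end{document}

In practice the intractable posterior $p(\theta \mid D)$ is replaced by the joint $p(\theta, D)$, which shifts the bound by the constant $\log p(D)$ and leaves the optimization unchanged.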

Cited by 3 publications (3 citation statements) · References 16 publications

Citation statements:
“…However, leveraging these to establish the PAC-Bayes bounds for the KL-VB posterior is a challenging effort that we leave to future papers. Finally it is of interest to generalize our PAC-bounds to posterior approximations beyond KL-variational inference, such as α-Rényi posterior approximations [6], and loss-calibrated posterior approximations [24, 25].…”
Section: Discussion
confidence: 99%
“…However, leveraging these to establish the PAC-Bayes bounds for the KL-VB posterior is a challenging effort that we leave to future papers. Finally it is of interest to generalize our PAC-bounds to posterior approximations beyond KL-variational inference, such as α-Rényi posterior approximations [19], and loss-calibrated posterior approximations [18, 14].…”
Section: Discussion
confidence: 99%
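
For context on the alternatives named in these statements, a brief sketch of the α-Rényi variational objective (our notation, following the standard variational Rényi bound; not quoted from the citing papers):

% A hedged sketch of alpha-Rényi variational inference; notation assumed.
\documentclass{article}
\usepackage{amsmath}
\begin{document}
For $\alpha > 0$, $\alpha \neq 1$, the R\'enyi divergence is
\[
D_{\alpha}\!\left( q \,\middle\|\, p \right)
  = \frac{1}{\alpha - 1}
    \log \int q(\theta)^{\alpha}\, p(\theta)^{1-\alpha}\, \mathrm{d}\theta,
\]
and $\alpha$-R\'enyi variational inference maximizes
\[
\mathcal{L}_{\alpha}(q)
  = \frac{1}{1-\alpha}
    \log \mathbb{E}_{q}\!\left[ \left( \frac{p(\theta, D)}{q(\theta)} \right)^{\!1-\alpha} \right]
  = \log p(D) - D_{\alpha}\!\left( q \,\middle\|\, p(\cdot \mid D) \right),
\]
recovering the standard ELBO in the limit $\alpha \to 1$.
\end{document}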
“…In the online setting, online variational approximations are studied by [31, 32] and led to the first scaling of Bayesian principles to state-of-the-art neural networks [38]. In the i.i.d. setting, a series of papers established the first theoretical results on variational inference, for some of them through a connection with PAC-Bayes bounds [5, 43, 20, 16, 14, 47, 4, 48, 46, 29, 49, 15]. To our knowledge, the only regret bound for online variational inference can be found in [17].…”
Section: Related Work
confidence: 99%