Gibbs posterior inference on multivariate quantiles (2022)
DOI: 10.1016/j.jspi.2021.10.003

Cited by 13 publications (15 citation statements)
References 32 publications
“…BvM for the median and quantiles under classical and Gibbs posteriors [4], [5]. Interestingly, [46] show that Bayesian neural networks exhibit inconsistency similar to that discussed in [16]; applying variational Bayes leads to the BNN becoming consistent. It would be interesting to study whether it is possible to achieve asymptotic efficiency.…”
Section: Discussion and Open Questions
confidence: 99%
“…Beyond consistency and rates, there are cases in which the Gibbs posterior enjoys a version of the celebrated Bernstein–von Mises theorem, i.e., that the Gibbs posterior takes on a Gaussian shape asymptotically. This was demonstrated for a special case in Bhattacharya and Martin (2022), but their results are generalized below.…”
Section: Distributional Approximations
confidence: 74%
“…In Bhattacharya and Martin (2022), the effect of η on the asymptotic Gibbs posterior mean was overlooked: they stated that the posterior mean was θ_n instead of η θ_n + (1 − η) θ as in (17). This small effect went unnoticed because the learning rate suggested in the former case ends up being larger than in the latter, hence more conservative Gibbs posterior credible regions.…”
confidence: 99%
“…In many cases, it is more natural to formulate the inference problem with a loss function rather than a statistical model. These are often referred to as Gibbs posterior distributions; see Syring and Martin (2017, 2019, 2020b), Bhattacharya and Martin (2022), Wang and Martin (2020), and Section 6 below.…”
Section: Generalized Bayes
confidence: 99%
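To make the loss-based formulation concrete, here is a minimal sketch of a Gibbs posterior for a single quantile; it is my own illustration, not code from any of the cited papers. The check loss replaces a negative log-likelihood, and the flat prior, learning rate eta, and Metropolis settings are all illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

def check_loss(theta, x, tau):
    # rho_tau(x - theta): the usual quantile "check" loss
    u = x - theta
    return np.mean(u * (tau - (u < 0)))

def log_gibbs_post(theta, x, tau, eta):
    # log of exp(-eta * n * R_n(theta)) times the prior; flat prior here,
    # so the prior contributes only an additive constant
    n = len(x)
    return -eta * n * check_loss(theta, x, tau)

def sample_gibbs(x, tau=0.5, eta=1.0, n_iter=5000, step=0.5):
    # random-walk Metropolis targeting the Gibbs posterior
    theta = np.median(x)            # start at the sample quantile
    lp = log_gibbs_post(theta, x, tau, eta)
    draws = np.empty(n_iter)
    for i in range(n_iter):
        prop = theta + step * rng.standard_normal()
        lp_prop = log_gibbs_post(prop, x, tau, eta)
        if np.log(rng.uniform()) < lp_prop - lp:   # accept/reject
            theta, lp = prop, lp_prop
        draws[i] = theta
    return draws

x = rng.standard_normal(200)        # toy data
draws = sample_gibbs(x, tau=0.5, eta=1.0)
print(draws[1000:].mean(), np.quantile(draws[1000:], [0.025, 0.975]))

Larger eta concentrates the draws more tightly around the sample quantile, so the choice of learning rate drives the width and coverage of Gibbs credible intervals, which is exactly the calibration issue discussed in the quotes above.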