2022
DOI: 10.1111/rssb.12500
Robust Generalised Bayesian Inference for Intractable Likelihoods

Abstract: Generalised Bayesian inference updates prior beliefs using a loss function, rather than a likelihood, and can therefore be used to confer robustness against possible mis‐specification of the likelihood. Here we consider generalised Bayesian inference with a Stein discrepancy as a loss function, motivated by applications in which the likelihood contains an intractable normalisation constant. In this context, the Stein discrepancy circumvents evaluation of the normalisation constant and produces generalised post…
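For orientation, a minimal sketch of the update the abstract describes, in the standard generalised-Bayes form: the prior π(θ) is updated through a loss ℓ rather than a log-likelihood, with a learning rate β > 0 (the notation here is illustrative, not taken from the paper):

```latex
\pi_n(\theta) \;\propto\; \pi(\theta)\,\exp\!\Big\{-\beta \sum_{i=1}^{n} \ell(\theta, x_i)\Big\}
```

Taking ℓ(θ, x) = −log p_θ(x) and β = 1 recovers the ordinary Bayesian posterior; the paper instead takes the loss to be a Stein discrepancy, which can be evaluated without the model's normalisation constant.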

Cited by 20 publications (21 citation statements). References 50 publications.
“…The proof of Theorem 1 is provided in Appendix A.2. The result was established using similar arguments from early work by Hooker and Vidyashankar (2014); Ghosh and Basu (2016) and extended techniques of Miller (2021); Matsubara et al (2022). See also the recent review of Bochkina (2022).…”
Section: A Generalised Posterior (supporting, confidence: 52%)
“…To circumvent both computation of the normalising constant and simulation from the statistical model, Matsubara et al (2022) proposed a generalised Bayesian posterior, called KSD-Bayes, which is based on a Stein discrepancy. The resulting generalised posterior is consistent and asymptotically normal, and thus shares many of the properties of the standard Bayesian posterior whilst admitting a form which does not require the computation of an intractable normalisation constant.…”
Section: Introduction (mentioning, confidence: 99%)
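To make concrete why the kernel Stein discrepancy sidesteps the normalising constant: it depends on the model only through the score ∇ₓ log p_θ(x), and since log p = log p̃ − log Z, the constant Z cancels from the score. Below is a minimal, hypothetical NumPy sketch of a V-statistic estimate of the squared KSD; the function name ksd_vstat, the Gaussian base kernel, and the bandwidth sigma are all illustrative choices, not the paper's implementation.

```python
import numpy as np

def ksd_vstat(samples, score, sigma=1.0):
    """V-statistic estimate of the squared kernel Stein discrepancy (KSD)
    between a model p and the empirical distribution of `samples` (n x d).

    `score` maps a point x to grad_x log p(x); because log p = log p~ - log Z,
    the normalising constant Z drops out of the score and is never evaluated.
    Gaussian RBF base kernel with bandwidth `sigma` (an illustrative choice).
    """
    n, d = samples.shape
    S = np.apply_along_axis(score, 1, samples)          # (n, d) score matrix
    diffs = samples[:, None, :] - samples[None, :, :]   # x_i - x_j, (n, n, d)
    sq = np.sum(diffs ** 2, axis=-1)                    # squared distances
    K = np.exp(-sq / (2 * sigma ** 2))                  # base kernel matrix
    grad_y = diffs / sigma ** 2                         # grad_y k(x,y) / k(x,y)
    # Stein kernel k_p(x_i, x_j), assembled term by term:
    term1 = (S @ S.T) * K                               # s(x)^T s(y) k(x,y)
    term2 = np.einsum('id,ijd->ij', S, grad_y) * K      # s(x)^T grad_y k(x,y)
    term3 = np.einsum('jd,ijd->ij', S, -grad_y) * K     # s(y)^T grad_x k(x,y)
    term4 = (d / sigma ** 2 - sq / sigma ** 4) * K      # tr(grad_x grad_y k)
    return np.mean(term1 + term2 + term3 + term4)

# Example: data drawn from the model (standard normal, score(x) = -x)
# should give a KSD estimate near zero.
rng = np.random.default_rng(0)
x = rng.standard_normal((200, 2))
print(ksd_vstat(x, score=lambda z: -z))
```

KSD-Bayes, as described in the statement above, then plugs an estimate of this kind, scaled by n and a learning rate β, into the exponent of the generalised posterior sketched earlier, yielding a posterior that never touches the intractable normalisation constant.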
“…iv) This study has focused on estimation and computational methods proposed in the generalised Bayesian framework, in particular using robust divergence. On the other hand, methods using the Maximum Mean Discrepancy (Chérief-Abdellatif and Alquier, 2020) and Kernel Stein Discrepancy (Matsubara et al, 2022) have also been proposed in recent years in the same generalised Bayesian framework, although not with the motivation of dealing with outliers. Both require adjustment of the hyperparameters of the kernel used, and it may be possible to estimate them using our proposed method and compare their performance.…”
Section: Hertzsprung-Russell Star Cluster Data (mentioning, confidence: 99%)