2018
DOI: 10.1080/01621459.2018.1469995
Robust Bayesian Inference via Coarsening

Abstract: The standard approach to Bayesian inference is based on the assumption that the distribution of the data belongs to the chosen model class. However, even a small violation of this assumption can have a large impact on the outcome of a Bayesian procedure. We introduce a simple, coherent approach to Bayesian inference that improves robustness to perturbations from the model: rather than condition on the data exactly, one conditions on a neighborhood of the empirical distribution. When using neighborhoods based o…
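The coarsening idea in the abstract can be sketched numerically. For relative-entropy neighborhoods, the coarsened posterior is approximately a power posterior in which the likelihood is raised to ζ_n = α/(α + n), where α > 0 controls the neighborhood size; the toy Normal-mean example below (all names and values are illustrative assumptions, not the paper's code) contrasts it with the standard posterior on contaminated data:

```python
import numpy as np

def power_posterior(y, sigma2=1.0, mu0=0.0, tau2=10.0, alpha=None):
    """Conjugate Normal-Normal update with the likelihood raised to zeta.

    alpha=None gives zeta = 1 (the standard posterior); otherwise the
    c-posterior approximation zeta = alpha / (alpha + n) is used (an
    assumed form for this sketch).
    """
    n = len(y)
    zeta = 1.0 if alpha is None else alpha / (alpha + n)
    prec = 1.0 / tau2 + zeta * n / sigma2          # posterior precision
    mean = (mu0 / tau2 + zeta * np.sum(y) / sigma2) / prec
    return mean, 1.0 / prec                        # posterior mean, variance

rng = np.random.default_rng(0)
y = rng.normal(2.0, 1.0, size=500)
y[:25] += 10.0                                     # 5% gross contamination

m_std, v_std = power_posterior(y)                  # standard posterior
m_c, v_c = power_posterior(y, alpha=50.0)          # coarsened posterior
```

Under contamination the coarsened posterior remains much wider than the standard one, reflecting residual uncertainty instead of concentrating dogmatically as n grows.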

Cited by 149 publications (147 citation statements: 6 supporting, 141 mentioning, 0 contrasting)
References 31 publications
“…Furthermore, we derive asymptotic settings under which the WABC posterior behaves differently from the posterior by providing upper bounds on concentration as the number of observations goes to ∞, and we illustrate the potential effects of model misspecification and the dimension of the observation space. The WABC posterior is a particular case of a coarsened posterior, and our results are complementary to those of Miller and Dunson (2018).…”
Section: Introduction (supporting)
confidence: 86%
“…Therefore, the WABC distribution with a fixed ε does not converge to a Dirac mass, in contrast with the standard posterior. As argued in Miller and Dunson (2018), this can have some benefit in case of model misspecification: the WABC posterior is less sensitive to perturbations of the data-generating process than the standard posterior is.…”
Section: Behaviour as n → ∞ for fixed ε (mentioning)
confidence: 99%
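The behaviour quoted above — a Wasserstein-ABC posterior that stays diffuse when the tolerance ε is held fixed — can be illustrated with a minimal rejection-ABC sketch. Everything here (the model, prior, tolerance, and sample sizes) is an assumption for illustration, not the cited authors' code:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
y_obs = rng.normal(1.0, 1.0, size=n)               # "observed" data
y_obs[:10] += 8.0                                  # mild contamination

def w1(a, b):
    """1-Wasserstein distance between equal-size 1-D samples
    (sort both samples and average the absolute differences)."""
    return np.mean(np.abs(np.sort(a) - np.sort(b)))

eps = 1.0                                          # fixed tolerance (assumed)
accepted = []
for _ in range(5000):
    theta = rng.normal(0.0, 3.0)                   # draw from the prior
    y_sim = rng.normal(theta, 1.0, size=n)         # simulate from the model
    if w1(y_obs, y_sim) <= eps:                    # accept if within eps
        accepted.append(theta)
accepted = np.array(accepted)
```

Because ε stays fixed as n grows, the accepted-parameter distribution retains nonzero spread rather than collapsing to a point, which is the robustness property discussed in the quote.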
“…We consider the posterior convergence of the Bayesian fractional posterior obtained by raising the likelihood function by a factor η ∈ (0,1] in the Bayes formula
$$\Pi_{n,\eta}(A) = \frac{\int_A \prod_{i=1}^{n} p_f(Y_i \mid X_i)^{\eta} \, \Pi(df)}{\int \prod_{i=1}^{n} p_f(Y_i \mid X_i)^{\eta} \, \Pi(df)},$$
where $\Pi$ denotes the prior probability measure over $L_2([0,1]^p)$: the $\mathcal{L}_2$-space over $[0,1]^p$. Fractional posteriors have gained renewed attention in Bayesian statistics because of their robustness to model misspecification (Grünwald, 2012; Miller and Dunson, 2018). According to Walker and Hjort (2001), the fractional posterior can be viewed as combining the original likelihood function with a data-dependent prior that is divided by a portion of the likelihood.…”
Section: Theoretical Results (mentioning)
confidence: 99%
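The fractional-posterior formula quoted above can be evaluated directly on a grid when the prior Π sits on a one-dimensional location parameter. The sketch below uses a hypothetical Normal location model without covariates (not the cited authors' nonparametric setting) to show how raising the likelihood to η < 1 flattens the posterior:

```python
import numpy as np

def norm_logpdf(x, mu, sigma):
    # log density of N(mu, sigma^2)
    return -0.5 * np.log(2 * np.pi * sigma**2) - 0.5 * ((x - mu) / sigma) ** 2

rng = np.random.default_rng(2)
y = rng.normal(0.5, 1.0, size=100)                 # i.i.d. data

mu_grid = np.linspace(-3.0, 3.0, 601)              # discretized parameter space
dx = mu_grid[1] - mu_grid[0]
log_prior = norm_logpdf(mu_grid, 0.0, 2.0)         # prior Pi: N(0, 4)
loglik = np.array([norm_logpdf(y, m, 1.0).sum() for m in mu_grid])

def fractional_posterior(eta):
    """Pi_{n,eta}: raise the likelihood to eta, then renormalize on the grid."""
    logw = eta * loglik + log_prior
    w = np.exp(logw - logw.max())                  # subtract max for stability
    return w / (w.sum() * dx)                      # normalized grid density

post = fractional_posterior(1.0)                   # eta = 1: ordinary posterior
frac = fractional_posterior(0.5)                   # eta = 1/2: wider posterior
```

With η = 1/2 the effective sample size is halved, so the fractional posterior has roughly twice the variance of the ordinary posterior around essentially the same mean.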
“…Bayesian robust inference of the mixing measure in finite mixture models has been of interest recently; see, for example, [Miller and Dunson, 2015]. Whether the idea of the minimum Hellinger distance estimator can be adapted to that setting is also an interesting direction to consider in the future.…”
Section: Summaries and Discussion (mentioning)
confidence: 99%