2016
DOI: 10.1111/rssb.12158

A General Framework for Updating Belief Distributions

Abstract: We propose a framework for general Bayesian inference. We argue that a valid update of a prior belief distribution to a posterior can be made for parameters which are connected to observations through a loss function rather than the traditional likelihood function, which is recovered as a special case. Modern application areas make it increasingly challenging for Bayesians to attempt to model the true data‐generating mechanism. For instance, when the object of interest is low dimensional, such as a mean…
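The update sketched in the abstract is often written as π(θ | x) ∝ π(θ) exp{−w Σᵢ ℓ(θ, xᵢ)}, a so-called Gibbs posterior. A minimal numerical sketch on a discretised parameter grid (the squared-error loss, the N(0, 1) prior, the learning rate w, and the toy data are illustrative assumptions, not taken from the paper):

```python
import numpy as np

# General Bayesian update on a discretised parameter grid:
#   posterior(theta) ∝ prior(theta) * exp(-w * sum_i loss(theta, x_i)).
# The likelihood-based Bayes update is the special case where the loss
# is the negative log-likelihood and w = 1.

def gibbs_posterior(grid, prior, losses, w=1.0):
    """grid: candidate theta values; prior: prior density on the grid;
    losses: total empirical loss at each grid point; w: learning rate."""
    log_post = np.log(prior) - w * losses
    log_post -= log_post.max()            # stabilise the exponential
    post = np.exp(log_post)
    return post / post.sum()              # normalise to a distribution

x = np.array([0.9, 1.1, 1.3])             # toy observations (assumed)
grid = np.linspace(-2, 4, 601)
prior = np.exp(-0.5 * grid**2)            # N(0, 1) prior, unnormalised
losses = 0.5 * ((x[:, None] - grid[None, :])**2).sum(axis=0)
post = gibbs_posterior(grid, prior, losses)
```

With squared-error loss and w = 1 this reproduces the conjugate Gaussian update (posterior mode at Σxᵢ/(n + 1), here 0.825), while other losses yield belief updates targeting the corresponding functional without a full likelihood.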

Cited by 314 publications (504 citation statements)
References 56 publications
“…Using the same proof of Lemma 4 for inequality (8) in Lemma 3, we obtain, with probability at least 1 − ε/2, ε ∈ (0, 1), for any distribution π, that…”
Section: Lemma (mentioning)
confidence: 93%
“…Here, for computational reasons, we replace the likelihood by a pseudo-likelihood. This is an increasingly popular method in Bayesian statistics [8] and in machine learning [16,2,7]. We define the pseudo-posterior by…”
Section: Pseudo-Bayesian Estimation (mentioning)
confidence: 99%
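One concrete pseudo-posterior of this kind is the tempered (fractional) posterior, which raises the likelihood to a power α ∈ (0, 1] before combining it with the prior. A sketch under assumed choices (the Gaussian model, the N(0, 1) prior, and the α values are for illustration, not the quoted paper's construction):

```python
import numpy as np

# Tempered pseudo-posterior: pi_alpha(theta | x) ∝ prior(theta) * L(theta; x)**alpha.
# For x_i ~ N(theta, 1) with a N(0, 1) prior the update stays conjugate:
#   pi_alpha = N(alpha * sum(x) / (1 + alpha * n), 1 / (1 + alpha * n)).

def tempered_gaussian_posterior(x, alpha):
    n = len(x)
    var = 1.0 / (1.0 + alpha * n)
    mean = alpha * x.sum() * var
    return mean, var

x = np.array([0.9, 1.1, 1.3])                         # toy data (assumed)
m1, v1 = tempered_gaussian_posterior(x, alpha=1.0)    # ordinary Bayes update
m5, v5 = tempered_gaussian_posterior(x, alpha=0.5)    # downweighted data
```

Setting α = 1 recovers the standard posterior; α < 1 discounts the data, widening the posterior, which is one way such pseudo-posteriors trade fidelity for robustness or computational convenience.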
“…Using an off-the-shelf method, poor mixing may follow, particularly if latent variables are sampled explicitly [16]. Alternatives include estimating the likelihood function within an MCMC step [1]; using only summary statistics implied by draws from the model [4]; and using surrogate likelihood functions [6,25]. While much of the motivation for the latter is to avoid a complete specification of a model, which calls for explicit assumptions on nuisance parameters, computational considerations are also invoked.…”
Section: Contribution (mentioning)
confidence: 99%
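The first alternative quoted above, estimating the likelihood within an MCMC step, is the pseudo-marginal idea: an unbiased Monte Carlo estimate of the likelihood can stand in for the exact value in the Metropolis–Hastings ratio while leaving the target distribution unchanged. A toy sketch (the latent-Gaussian model, flat prior, seed, and tuning constants are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
x_obs = 1.0

def lik_hat(theta, m=64):
    # Unbiased estimate of the marginal likelihood N(x_obs; theta, 2)
    # for the latent model z ~ N(theta, 1), x | z ~ N(z, 1):
    # average the conditional density N(x_obs; z, 1) over z-draws.
    z = rng.normal(theta, 1.0, size=m)
    return np.exp(-0.5 * (x_obs - z) ** 2).mean() / np.sqrt(2 * np.pi)

def pseudo_marginal_mh(n_iter=2000, step=0.8):
    theta, L = 0.0, lik_hat(0.0)
    chain = []
    for _ in range(n_iter):
        prop = theta + step * rng.normal()
        L_prop = lik_hat(prop)
        # Flat prior on theta: accept with the *estimated* likelihood ratio.
        # Crucially, the current estimate L is recycled, not re-estimated.
        if rng.random() < L_prop / L:
            theta, L = prop, L_prop
        chain.append(theta)
    return np.array(chain)

chain = pseudo_marginal_mh()
```

Here the exact posterior is N(1, 2), and the chain targets it despite never evaluating the marginal likelihood exactly; noisier estimates (smaller m) keep the target exact but make the chain stickier.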
“…We generate synthetic models with 10 latent variables and three variables per cluster, and 5 ordinal levels for each observed variable (a total of 30 observed variables).…”
Section: Synthetic Experiments (mentioning)
confidence: 99%
“…In this paper, instead, the definition of conditional probability is solely based on a set of axioms. This idea was developed by reference [2], who defined a framework for general Bayesian inference and discussed its application to important statistical problems. The present paper also addresses the issue of calibrating the conditional probability with non-stochastic information.…”
Section: Introduction (mentioning)
confidence: 99%