2020
DOI: 10.1017/s0960129519000082

A channel-based perspective on conjugate priors

Abstract: A desired closure property in Bayesian probability is that an updated posterior distribution be in the same class of distributions – say Gaussians – as the prior distribution. When the updating takes place via a statistical model, one calls the class of prior distributions the ‘conjugate priors’ of the model. This paper gives (1) an abstract formulation of this notion of conjugate prior, using channels, in a graphical language, (2) a simple abstract proof that such conjugate priors yield Bayesian inversions, and …
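As a concrete illustration of the closure property the abstract describes (a minimal sketch of my own in Python, not the paper's channel-based formalism): the Beta family is conjugate to the Bernoulli model, so updating a Beta prior on observed coin flips only moves its two hyperparameters and never leaves the Beta class.

```python
# Minimal sketch (not from the paper): the Beta family is conjugate to
# the Bernoulli likelihood, so the posterior after each observation is
# again a Beta distribution -- the closure property described above.

from dataclasses import dataclass

@dataclass
class Beta:
    alpha: float  # pseudo-count of successes
    beta: float   # pseudo-count of failures

    def update(self, observation: int) -> "Beta":
        """Bayesian update on one Bernoulli observation (0 or 1).

        The posterior stays inside the Beta class: only the two
        hyperparameters change.
        """
        return Beta(self.alpha + observation, self.beta + (1 - observation))

    def mean(self) -> float:
        return self.alpha / (self.alpha + self.beta)

prior = Beta(1.0, 1.0)             # uniform prior
posterior = prior
for x in [1, 1, 0, 1]:             # observed coin flips
    posterior = posterior.update(x)

print(posterior)                   # Beta(alpha=4.0, beta=2.0)
print(round(posterior.mean(), 3))  # 0.667
```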

Cited by 18 publications (9 citation statements). References 16 publications.
“…The former has been extensively studied, going back to (Lawvere, 1962) and spanning dozens of papers (Corfield, 2021). The latter is less popular, but has seen a number of papers in recent years studying causality (Fong, 2013), conjugate priors (Jacobs, 2017) and general aspects of Bayesian learning (Culbertson and Sturtz, 2014, 2013).…”
Section: Applications, Successes and Motivation
confidence: 99%
“…• Probabilistic Machine Learning. The study of updating a distribution with samples from a dataset and reasoning about uncertainty arising from noisy measurements (Culbertson and Sturtz, 2014, 2013; Jacobs, 2017).…”
Section: Applications, Successes and Motivation
confidence: 99%
“…All of our examples FinStoch, BorelStoch and Gauss have conditionals [5], [11]. For Gauss, this captures the self-conjugacy of Gaussians [19]. An explicit formula generalizing (4) is given in [11], but we shall only require the existence of conditionals and work with their universal property.…”
Section: Definition III.3 ([11])
confidence: 99%
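To make the self-conjugacy of Gaussians cited here concrete, a small numeric sketch (scalar case with known observation noise; this is standard Bayesian arithmetic, not the categorical construction in Gauss):

```python
# Sketch of Gaussian self-conjugacy (assumptions: scalar case, known
# observation noise). A Gaussian prior N(mu0, var0) combined with a
# Gaussian likelihood N(x | mu, noise_var) gives a Gaussian posterior
# over mu, so the family is closed under updating.

def gaussian_posterior(mu0: float, var0: float,
                       x: float, noise_var: float) -> tuple[float, float]:
    """Posterior mean and variance for mu after observing x.

    Precisions (inverse variances) add, and the posterior mean is the
    precision-weighted average of the prior mean and the observation.
    """
    precision = 1.0 / var0 + 1.0 / noise_var
    post_var = 1.0 / precision
    post_mean = post_var * (mu0 / var0 + x / noise_var)
    return post_mean, post_var

# Prior N(0, 1), one observation x = 2.0 with noise variance 1.0:
mean, var = gaussian_posterior(0.0, 1.0, 2.0, 1.0)
print(mean, var)  # 1.0 0.5 -- again a Gaussian, as conjugacy requires
```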
“…The Bayesian approach to learning can handle additional data via conditioning. This will be made precise below, using a logical formulation that is characteristic for conjugate priors, see [13]. For numbers n and i ∈ n we write 1_{i} : n → [0, 1] for the singleton predicate that is 1 on i ∈ n and 0 elsewhere.…”
Section: Prerequisites on Continuous Probability
confidence: 99%
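A short sketch of how such a singleton predicate drives an update (a hypothetical Python rendering of the quoted setup; the names Dist, Channel, update, and the example prior are mine, not from [13]):

```python
# Sketch (my notation, not the paper's code): updating a prior along a
# channel with a singleton predicate 1_{i}, in the discrete setting the
# quote describes. A channel is just a function from parameters to
# distributions; conditioning reweights the prior by the predicate
# pulled back along the channel, then renormalises.

from typing import Callable, Dict

Dist = Dict[float, float]          # finitely supported distribution
Channel = Callable[[float], Dist]  # parameter -> distribution on outcomes

def singleton(i: int) -> Callable[[int], float]:
    """The predicate 1_{i}: value 1 on outcome i, 0 elsewhere."""
    return lambda o: 1.0 if o == i else 0.0

def update(prior: Dist, channel: Channel,
           pred: Callable[[int], float]) -> Dist:
    """Condition the prior on the predicate pulled back along the channel."""
    weights = {t: p * sum(channel(t)[o] * pred(o) for o in channel(t))
               for t, p in prior.items()}
    total = sum(weights.values())
    return {t: w / total for t, w in weights.items()}

# Prior over three possible coin biases; channel = Bernoulli(theta):
prior: Dist = {0.2: 1/3, 0.5: 1/3, 0.8: 1/3}
coin: Channel = lambda theta: {1: theta, 0: 1 - theta}

posterior = update(prior, coin, singleton(1))  # observe heads (outcome 1)
print(posterior)  # higher biases gain mass: {0.2: 0.133.., 0.5: 0.333.., 0.8: 0.533..}
```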