2017
DOI: 10.1145/3158148
Denotational validation of higher-order Bayesian inference

Abstract: We present a modular semantic account of Bayesian inference algorithms for probabilistic programming languages, as used in data science and machine learning. Sophisticated inference algorithms are often explained in terms of composition of smaller parts. However, neither their theoretical justification nor their implementation reflects this modularity. We show how to conceptualise and analyse such inference algorithms as manipulating intermediate representations of probabilistic programs using higher-order fun…

Cited by 59 publications (63 citation statements)
References 24 publications
“…Using the same argument as above, we can restrict this operator as follows (2). As stated above, all the information about the semantics of observe(sample(normal(x, 1))) is contained in the Köthe dual of this operator, which, through the Riesz Representation and Functional Representation natural transformations, can be typed modulo isomorphism as (M−(N(−, 1)))^σ : (MR)^{N(0, …)}. Here the Bayesian inverse of our original probability kernel appears explicitly, showing that our semantics indeed captures the notion of Bayesian inverse.…”
Section: Conditionals and While Loops
confidence: 99%
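As a concrete illustration of the Bayesian inversion this excerpt refers to — recovering the posterior on x from an observation of sample(normal(x, 1)) — here is a minimal sketch using the standard conjugate normal–normal update. The function name and interface are illustrative, not from the cited paper, which works with Köthe duals rather than closed-form updates.

```python
def bayes_invert_normal(prior_mean, prior_var, obs, obs_var=1.0):
    """Posterior for x ~ N(prior_mean, prior_var) after observing y ~ N(x, obs_var).

    This is the Bayesian inverse of the probability kernel x -> N(x, obs_var),
    computed via the standard conjugate normal-normal update.
    """
    precision = 1.0 / prior_var + 1.0 / obs_var
    post_var = 1.0 / precision
    post_mean = post_var * (prior_mean / prior_var + obs / obs_var)
    return post_mean, post_var

# With an N(0, 1) prior on x and an observation y = 2.0:
m, v = bayes_invert_normal(0.0, 1.0, 2.0)  # posterior N(1.0, 0.5)
```

The posterior precision is the sum of the prior and observation precisions, and the posterior mean is their precision-weighted average, which is exactly the kernel inversion the excerpt's semantics is shown to capture abstractly.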
“…Related work: two very powerful semantics for higher-order probabilistic programming have recently been developed in the literature. In [1], [2], a semantics is given in terms of so-called quasi-Borel spaces. These form a Cartesian closed category and admit a notion of probability distribution and a Giry-like monad of probability distributions.…”
Section: Introduction
confidence: 99%
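A rough intuition for the quasi-Borel-space probability monad mentioned in this excerpt: a distribution on X is (an equivalence class of) a pair of a base measure on ℝ and a "random element" α : ℝ → X that pushes it forward. The sampler-level sketch below is illustrative only — class and function names are my own, and equivalence of pairs and measurability conditions are elided.

```python
import random

class Dist:
    """A distribution on X as a pair (alpha, mu): a random element alpha: R -> X
    pushing forward a base measure mu on R (represented here by a sampler)."""
    def __init__(self, alpha, sample_r):
        self.alpha = alpha          # random element R -> X
        self.sample_r = sample_r    # sampler for the base measure on R

    def sample(self):
        return self.alpha(self.sample_r())

def dirac(x):
    # Point mass: alpha is constant, so the base measure is irrelevant.
    return Dist(lambda r: x, lambda: 0.0)

def bind(d, k):
    # Monadic bind, sketched at the sampler level: push d's base measure
    # through alpha, then sample from the continuation's distribution.
    return Dist(lambda r: k(d.alpha(r)).sample(), d.sample_r)

gaussian = Dist(lambda r: r, lambda: random.gauss(0.0, 1.0))
shifted = bind(gaussian, lambda x: dirac(x + 1.0))  # ~ N(1, 1)
```

Because the category of quasi-Borel spaces is Cartesian closed, the same construction also yields distributions over *function* spaces, which is what makes it suitable for higher-order probabilistic programs.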
“…Such engines have clearly formulated correctness conditions from Markov chain theory [Geyer 2011; Green 1995; Hastings 1970; Metropolis et al 1953], such as ergodicity and correct stationarity. Tools from formal semantics have been employed to show that the inference engines satisfy these conditions [Borgström et al 2016; Hur et al 2015; Kiselyov 2016; Scibior et al 2018]. While looking at different algorithms, some of these works consider languages more expressive than the one we used in the paper, in particular languages supporting higher-order functions.…”
Section: Related Work
confidence: 99%
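The correctness conditions this excerpt names — correct stationarity and ergodicity — can be made concrete with a minimal random-walk Metropolis sampler (a special case of Metropolis–Hastings with a symmetric proposal). This is a generic textbook sketch, not the cited papers' inference engines; names are illustrative.

```python
import math
import random

def metropolis_hastings(log_density, x0, steps, step_size=1.0):
    """Random-walk Metropolis targeting exp(log_density) (unnormalised).

    The proposal is symmetric, so the acceptance ratio reduces to
    density(x') / density(x).  Detailed balance with respect to the target
    gives correct stationarity; irreducibility of the random walk gives
    ergodicity -- the two correctness conditions from Markov chain theory.
    """
    x, chain = x0, []
    for _ in range(steps):
        proposal = x + random.gauss(0.0, step_size)
        log_ratio = log_density(proposal) - log_density(x)
        if log_ratio >= 0 or random.random() < math.exp(log_ratio):
            x = proposal
        chain.append(x)
    return chain

# Target: standard normal, via its unnormalised log-density.
random.seed(1)
chain = metropolis_hastings(lambda x: -0.5 * x * x, 0.0, 20000)
```

After discarding burn-in, the empirical mean and variance of the chain approach 0 and 1, the moments of the stationary distribution — which is what a semantic correctness proof for such an engine must establish in general.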
“…This is an active research topic [73], but formalization faces serious issues. For one, there are incompatibilities with the standard measure-theoretic foundation of probability theory, which may require rethinking how probability theory is formulated [11, 70, 69, 34, 68]. First-order logic is among the most studied formal languages, making it easy to use a first-order knowledge base with various software.…”
Section: Bayesian Higher-order Probabilistic Programming
confidence: 99%
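The incompatibility this excerpt alludes to is easy to exhibit operationally: a higher-order probabilistic program can sample a *function* and then apply it, yet there is no σ-algebra on the measurable functions ℝ → ℝ making evaluation measurable, so standard measure theory cannot directly assign such a program a semantics (quasi-Borel spaces were introduced to repair this). A toy sketch, with illustrative names:

```python
import random

def random_affine():
    """Sample a random function f(x) = a*x + b with a, b ~ N(0, 1).

    Operationally trivial; the denotational problem is that 'apply the
    sampled function' needs evaluation to be measurable, which fails for
    the full space of measurable functions R -> R.
    """
    a, b = random.gauss(0.0, 1.0), random.gauss(0.0, 1.0)
    return lambda x: a * x + b

f = random_affine()   # a sample from a distribution over functions
y = f(3.0)            # applying the sampled function
```

Running the program poses no difficulty — the difficulty is purely foundational, which is why the cited works rethink the underlying formulation of probability theory rather than the programs themselves.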