2013
DOI: 10.1145/2480359.2429119
A model-learner pattern for Bayesian reasoning

Abstract: A Bayesian model is based on a pair of probability distributions, known as the prior and sampling distributions. A wide range of fundamental machine learning tasks, including regression, classification, clustering, and many others, can all be seen as Bayesian models. We propose a new probabilistic programming abstraction, a typed Bayesian model, which is based on a pair of probabilistic expressions for the prior and sampling distributions. A sampler for a model is an algorithm to compute synthetic data from it…
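The abstract's central idea, a model as a (prior, sampling) pair together with a sampler that produces synthetic data, can be sketched informally. This is a minimal illustration with hypothetical names, not the paper's actual typed API; the concrete example (a beta-Bernoulli coin model) is chosen for brevity.

```python
import random

# Hypothetical sketch: a Bayesian "model" as a pair of functions,
# (prior, sampling). The prior draws a latent parameter; the sampling
# distribution draws one observation given that parameter.

def make_model(prior, sampling):
    """A model is just the (prior, sampling) pair."""
    return (prior, sampling)

def sampler(model, n):
    """Draw a parameter from the prior, then n synthetic data
    points from the sampling distribution conditioned on it."""
    prior, sampling = model
    w = prior()  # latent parameter
    return w, [sampling(w) for _ in range(n)]

# Example model: a coin with unknown bias.
coin = make_model(
    prior=lambda: random.betavariate(1.0, 1.0),          # uniform prior on bias
    sampling=lambda p: 1 if random.random() < p else 0,  # Bernoulli draw
)

bias, data = sampler(coin, 5)
assert 0.0 <= bias <= 1.0
assert all(d in (0, 1) for d in data)
```

A learner, in the paper's terminology, would go the other direction: given observed data, infer a posterior over the latent parameter.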

Cited by 20 publications (27 citation statements). References 25 publications.
“…The original semantics of Tabular uses the higher-order model-learner pattern [12], based on a separate metalanguage. Given a Tabular schema S and an input database DB that matches S, our semantics consists of two algorithms.…”
Section: Spreadsheets and Typeful Probabilistic Programming
Mentioning; confidence: 99%
“…Our semantics is compositional: the model of a whole schema is assembled from models of tables, which themselves are composed from models of rows, assembled from models of individual cells. This formulation follows Gordon et al (2013a), with two refinements. First, when we apply a model to data, the model output is semi-observed, that is, each output is a pair consisting of an observed component (like a game outcome in TrueSkill) plus an unobserved latent component (like a performance in TrueSkill).…”
Section: Fun and the Model-Learner Pattern
Mentioning; confidence: 99%
“…Compared to the reference learner of Gordon et al (2013a), this new API can learn latent outputs since it works on semi-observed models. Our current implementation uses Infer.NET Fun to compute approximate marginal forms of the posterior distributions on the database parameter and latent database, and persists them to the relational store.…”
Section: A Reference Learner for Query-by-Latent-Column
Mentioning; confidence: 99%
“…Indeed, viewing algorithms as functions allows a smooth integration of distributions into the playground, itself nicely reflected at the level of types through monads [20,40]. As a matter of fact, many existing probabilistic programming languages [37,18] are designed around the λ-calculus or one of its incarnations, like Scheme.…”
Section: Introduction
Mentioning; confidence: 99%
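The last citation statement alludes to reflecting distributions "at the level of types through monads". As a hedged sketch of that idea, here is a finite discrete-distribution monad with `unit` and `bind`; the class and function names are illustrative, not the API of any particular probabilistic language.

```python
# Illustrative sketch of the "distributions as a monad" idea:
# a finite distribution with unit (return) and bind, where bind
# weights each continuation's outcomes by the incoming probability.

class Dist:
    def __init__(self, pairs):
        # pairs: list of (value, probability) with probabilities summing to 1
        self.pairs = pairs

    @staticmethod
    def unit(x):
        """Point mass at x (the monadic return)."""
        return Dist([(x, 1.0)])

    def bind(self, f):
        """Sequence this distribution with f: value -> Dist."""
        out = {}
        for x, p in self.pairs:
            for y, q in f(x).pairs:
                out[y] = out.get(y, 0.0) + p * q
        return Dist(sorted(out.items()))

def bernoulli(p):
    return Dist([(1, p), (0, 1.0 - p)])

# Distribution of the sum of two fair coin flips, via monadic bind.
two_flips = bernoulli(0.5).bind(
    lambda a: bernoulli(0.5).bind(
        lambda b: Dist.unit(a + b)))

probs = dict(two_flips.pairs)
# probs == {0: 0.25, 1: 0.5, 2: 0.25}
```

The point of the monadic view is exactly this compositionality: probabilistic programs are built by sequencing draws with `bind`, which is what lets λ-calculus-based languages integrate distributions smoothly.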