2016
DOI: 10.7287/peerj.preprints.1686
Preprint

Probabilistic programming in Python using PyMC3

Abstract: Probabilistic Programming allows for automatic Bayesian inference on user-defined probabilistic models. Recent advances in Markov chain Monte Carlo (MCMC) sampling allow inference on increasingly complex models. This class of MCMC, known as Hamiltonian Monte Carlo, requires gradient information which is often not readily available. PyMC3 is a new open source Probabilistic Programming framework written in Python that uses Theano to compute gradients via automatic differentiation as well as compile probabilistic…
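As a rough illustration of the workflow the abstract describes, the sketch below defines a small model in PyMC3 and samples it with the gradient-based NUTS sampler (the default for continuous models), with Theano supplying the gradients via automatic differentiation. The data, variable names, and prior choices are hypothetical and are not taken from the paper.

# Minimal sketch: a normal model with unknown mean and scale, sampled with NUTS.
import numpy as np
import pymc3 as pm

# Hypothetical data: noisy observations of a single unknown mean.
observed = np.random.normal(loc=1.0, scale=2.0, size=100)

with pm.Model() as model:
    mu = pm.Normal("mu", mu=0.0, sd=10.0)               # prior on the mean
    sigma = pm.HalfNormal("sigma", sd=5.0)               # prior on the noise scale
    pm.Normal("y", mu=mu, sd=sigma, observed=observed)   # likelihood of the data
    # pm.sample defaults to NUTS for continuous models; the required
    # gradients are computed by Theano through automatic differentiation.
    trace = pm.sample(1000, tune=1000)

print(pm.summary(trace))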

Cited by 597 publications (696 citation statements). References 3 publications.
“…Despite this complication, it is relatively straightforward to fit the RSM as a Bayesian model using a probabilistic programming framework, such as BUGS, JAGS, or Stan. For the models in this paper, we used the PyMC3 Python package (Patil et al, 2010; Salvatier et al, 2015), which is built on the Theano deep learning package (Bastien et al, 2012; Bergstra et al, 2010; version 0.9.0.dev2) and implements the state-of-the-art No U-Turn MCMC Sampler (Hoffman & Gelman, 2014). An alpha version of our NiPyMC analysis package is available online (https://github.com/PsychoinformaticsLab/nipymc; DOI, 10.5281/zenodo.168087; Yarkoni & Westfall, 2016).…”
Section: Methods | Citation type: mentioning | Confidence: 99%
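For context, a crossed random-effects model of the general kind this excerpt describes can be written in PyMC3 in a few lines and sampled with the No-U-Turn Sampler. The sketch below is a hypothetical stand-in, not the NiPyMC code; the data, index arrays, and prior choices are assumptions.

# Minimal sketch: crossed random effects for subjects and stimuli, sampled with NUTS.
import numpy as np
import pymc3 as pm

# Hypothetical design: n_obs responses indexed by subject and stimulus.
n_subj, n_stim, n_obs = 20, 30, 600
subj_idx = np.random.randint(n_subj, size=n_obs)
stim_idx = np.random.randint(n_stim, size=n_obs)
y = np.random.normal(size=n_obs)

with pm.Model() as rsm:
    intercept = pm.Normal("intercept", mu=0.0, sd=5.0)
    # Random effects for subjects and stimuli.
    sd_subj = pm.HalfCauchy("sd_subj", beta=1.0)
    sd_stim = pm.HalfCauchy("sd_stim", beta=1.0)
    u_subj = pm.Normal("u_subj", mu=0.0, sd=sd_subj, shape=n_subj)
    u_stim = pm.Normal("u_stim", mu=0.0, sd=sd_stim, shape=n_stim)
    sigma = pm.HalfCauchy("sigma", beta=1.0)
    mu = intercept + u_subj[subj_idx] + u_stim[stim_idx]
    pm.Normal("y", mu=mu, sd=sigma, observed=y)
    # NUTS is PyMC3's default sampler for continuous models.
    trace = pm.sample(1000, tune=1000)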
“…Hence, in total, there are three-level comparisons in our experiment, i.e., model comparison; prior comparison; and uncertainty comparison. The entire experiment is written in Python, using Theano [34], and Pymc3 [35] libraries. The partial code to produce this study is available from: https://github.com/LeonBai/Uncertainty-Flow.…”
Section: Methods | Citation type: mentioning | Confidence: 99%
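As a loose illustration of the "prior comparison" level mentioned in this excerpt, the sketch below fits the same likelihood under two different priors in PyMC3 and inspects how the posterior shifts. The data, prior widths, and variable names are hypothetical and are not taken from the Uncertainty-Flow code.

# Minimal sketch: fit one likelihood under two priors and compare posteriors.
import numpy as np
import pymc3 as pm

# Hypothetical data for a single unknown mean.
y = np.random.normal(loc=0.5, scale=1.0, size=50)

# Two candidate priors on the mean: a wide one and a narrow one.
prior_sds = {"wide": 10.0, "narrow": 0.1}
traces = {}

for name, prior_sd in prior_sds.items():
    with pm.Model():
        mu = pm.Normal("mu", mu=0.0, sd=prior_sd)
        pm.Normal("obs", mu=mu, sd=1.0, observed=y)
        traces[name] = pm.sample(1000, tune=1000)

# Compare how strongly each prior pulls the posterior mean toward zero.
for name, trace in traces.items():
    print(name, trace["mu"].mean(), trace["mu"].std())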
“…It was simulated in PyMC3 using the Automatic Differentiation Variational Inference (ADVI) algorithm with 100,000 draws, which is stable and converges on the posterior distribution in 10.76 s on a middle-range laptop computer. Although the mathematical notation may seem intimidating to practitioners who are not used to it, writing this in the probabilistic Python programming package PyMC3 [43] demonstrates the intuitive nature of such a model: It is important to note that no probability statements about the values inside the frequentist interval can be made, nor can one fit a distribution to the interval. The distribution indicated is strictly a Bayesian one.…”
Section: Bayesian Solution | Citation type: mentioning | Confidence: 99%
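The PyMC3 listing referred to in the excerpt above is not reproduced in the quoted text, so the following is a hypothetical stand-in showing what fitting a simple model with Automatic Differentiation Variational Inference (ADVI) looks like in PyMC3; apart from the 100,000 draws the excerpt mentions, the model, data, and parameters are assumptions.

# Minimal sketch: fit a simple model with ADVI, then sample from the approximation.
import numpy as np
import pymc3 as pm

# Hypothetical data: a normal sample with unknown mean and scale.
y = np.random.normal(loc=2.0, scale=1.5, size=200)

with pm.Model():
    mu = pm.Normal("mu", mu=0.0, sd=10.0)
    sigma = pm.HalfNormal("sigma", sd=5.0)
    pm.Normal("obs", mu=mu, sd=sigma, observed=y)
    # ADVI optimizes a variational approximation to the posterior; the
    # excerpt reports running it for 100,000 draws/iterations.
    approx = pm.fit(n=100000, method="advi")
    # Draw posterior samples from the fitted variational approximation.
    trace = approx.sample(2000)

print(pm.summary(trace))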