2020
DOI: 10.48550/arxiv.2012.10754
Preprint
Bambi: A simple interface for fitting Bayesian linear models in Python

Abstract: The popularity of Bayesian statistical methods has increased dramatically in recent years across many research areas and industrial applications. This is the result of a variety of methodological advances with faster and cheaper hardware as well as the development of new software tools. Here we introduce an open source Python package named Bambi (BAyesian Model Building Interface) that is built on top of the PyMC3 probabilistic programming framework and the ArviZ package for exploratory analysis of Bayesian mo…

Cited by 9 publications (6 citation statements)
References 18 publications
“…To investigate the influence of individual prediction tendencies on speech tracking under different noise conditions, we used Bayesian multilevel regression models with Bambi (Capretto et al., 2022), a Python package built on top of the PyMC3 package (Salvatier et al., 2016) for probabilistic programming. The correlation between predicted brain activity from speech envelope encoding and true brain activity was used as the dependent variable, and separate models were calculated per voxel using the following formula according to the Wilkinson notation (Wilkinson & Rogers, 1973): To investigate the influence of the higher-level probabilistic structure of speech, we also calculated a model for which the dependent variable included only cortical tracking (i.e. speech envelope encoding) results for lexically identical nouns of high vs. low surprisal: Before entering the models, prediction_tendency was zero-centered (note that in these interaction models the other predictors are assumed to be zero when estimating the effect of one predictor) and the number of distractors (0-2) was treated as a continuous variable.…”
Section: Methods
confidence: 99%
“…The correlation between predicted brain activity from speech envelope encoding and true brain activity was used as the dependent variable, and separate models were calculated per voxel using the following formula according to the Wilkinson notation (Wilkinson & Rogers, 1973): To investigate the influence of the higher-level probabilistic structure of speech, we also calculated a model for which the dependent variable included only cortical tracking (i.e. speech envelope encoding) results for lexically identical nouns of high vs. low surprisal: Before entering the models, prediction_tendency was zero-centered (note that in these interaction models the other predictors are assumed to be zero when estimating the effect of one predictor) and the number of distractors (0-2) was treated as a continuous variable. As priors we used the weakly- or non-informative default priors of Bambi (Capretto et al., 2022). For a summary of model parameters we report regression coefficients and 94% high density intervals (HDI) of the posterior distribution.…”
Section: Methods
confidence: 99%
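A 94% HDI is the narrowest interval containing 94% of the posterior draws. In practice ArviZ computes this (e.g. via its `az.hdi` function), but the idea can be sketched directly; the helper below is our own illustration, not ArviZ's implementation.

```python
import numpy as np

def hdi(samples, prob=0.94):
    """Narrowest interval containing `prob` of the samples (highest density interval)."""
    s = np.sort(np.asarray(samples))
    n = len(s)
    k = int(np.ceil(prob * n))           # number of samples the interval must cover
    widths = s[k - 1:] - s[: n - k + 1]  # width of every candidate interval of k samples
    i = int(np.argmin(widths))           # the narrowest one is the HDI
    return s[i], s[i + k - 1]

# Stand-in for posterior draws of a regression coefficient
rng = np.random.default_rng(1)
draws = rng.normal(loc=2.0, scale=1.0, size=10_000)
lo, hi = hdi(draws, prob=0.94)
```

For a symmetric posterior the HDI coincides with the central credible interval; for skewed posteriors it is shifted toward the mode, which is why it is often preferred for summarizing regression coefficients.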
“…To investigate the relationship between, e.g., age and aperiodic activity recorded using MEG, we used Bayesian generalized linear models (GLMs), either built directly in PyMC (a Python package for probabilistic programming; Salvatier et al., 2016) or in Bambi (a high-level interface to PyMC; Capretto et al., 2022). The choice between Bambi and PyMC was based on the availability of appropriate statistical families for the respective dependent variables in a GLM.…”
Section: Methods
confidence: 99%
“…To estimate this model, we used Bayesian methods, which can flexibly estimate the parameters of non-standard models. Models were estimated using the Bambi (Capretto et al., 2022) and PyMC3 (Salvatier et al., 2016) packages in Python. Before fitting the model, both predictors were z-score transformed, and the dependent variable (morph percentage) was scaled to the interval (0, 1) using the method suggested by Smithson and Verkuilen (2006).…”
Section: Regression Analytic Strategy
confidence: 99%
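The Smithson and Verkuilen (2006) transformation compresses a [0, 1]-scaled variable toward 0.5 so that no observation lands exactly on 0 or 1, which a Beta-family likelihood cannot accommodate. A minimal sketch (the function name is ours; the formula is the published one, y'' = (y'(n − 1) + 0.5)/n):

```python
import numpy as np

def squeeze_to_open_unit_interval(y):
    """Min-max scale to [0, 1], then compress into the open interval (0, 1)
    following Smithson & Verkuilen (2006): y'' = (y' * (n - 1) + 0.5) / n."""
    y = np.asarray(y, dtype=float)
    y01 = (y - y.min()) / (y.max() - y.min())  # scale to [0, 1]
    n = y01.size
    return (y01 * (n - 1) + 0.5) / n           # pull endpoints off 0 and 1

# Hypothetical morph percentages (0-100); n = 5 here, so the
# endpoints 0 and 100 map to 0.1 and 0.9 rather than 0 and 1.
morph = np.array([0.0, 25.0, 50.0, 75.0, 100.0])
scaled = squeeze_to_open_unit_interval(morph)
# → array([0.1, 0.3, 0.5, 0.7, 0.9])
```

The compression shrinks as the sample size n grows, so for realistic datasets the transformed values stay close to the raw proportions while remaining strictly inside (0, 1).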