2015
DOI: 10.18637/jss.v068.i04

Bayesian Model Averaging Employing Fixed and Flexible Priors: The BMS Package for R

Abstract: This article describes the BMS (Bayesian model sampling) package for R that implements Bayesian model averaging for linear regression models. The package excels in allowing for a variety of prior structures, among them the "binomial-beta" prior on the model space and the so-called "hyper-g" specifications for Zellner's g prior. Furthermore, the BMS package allows the user to specify her own model priors and offers a possibility of subjective inference by setting "prior inclusion probabilities" according to the…
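
To make the interface concrete, here is a minimal R sketch of how the priors named in the abstract are selected in a bms() call, using the growth data set shipped with the package. The sampler settings and the prior model size are illustrative choices, and the argument syntax should be checked against the documentation of the installed BMS version.

```r
library(BMS)        # the package described in this article
data(datafls)       # growth data bundled with BMS; dependent variable in the first column

# Binomial-beta ("random") model prior with a prior expected model size of 7,
# combined with the hyper-g specification anchored at the unit information prior.
fls_bma <- bms(datafls, burn = 10000, iter = 50000,
               mprior = "random", mprior.size = 7,
               g = "hyper=UIP", mcmc = "bd", user.int = FALSE)

coef(fls_bma)      # posterior inclusion probabilities and posterior moments
summary(fls_bma)   # posterior model-size distribution and sampler statistics
```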

Cited by 174 publications (173 citation statements). References 39 publications.
“…The key words used to search in CRAN were Model Selection, Variable Selection, Bayes Factor and Averaging. Conducting this search on February 17, 2017, we found a total of 13 packages (the version in parentheses): VarSelectIP(0.2‐1); spikeslab(1.1.5) (Ishwaran et al.); spikeslabGAM(1.1‐11) (Scheipl); ensembleBMA(5.1.3) (Fraley et al.); dma(1.2‐3) (McCormick et al.); BMA(3.18.6) (Raftery et al.); mglmn(0.0.2) (Katabuchi & Nakamura); varbvs(2.0‐8) (Carbonetto & Stephens); INLABMA(0.1‐8) (Bivand et al.); Bayesian adaptive sampling (BAS) (1.4.3) (Clyde); BayesFactor(0.9.12‐2) (Morey et al.); BayesVarSel(1.7.1) (Garcia‐Donato & Forte); BMS(0.3.4) (Zeugner & Feldkircher) and mombf(1.8.3) (Rossell et al.). As suggested by a referee, we also searched for related packages in rseek.org and in the CRAN task view devoted to Bayesian Inference, where the packages monomvn(1.9‐7) (Gramacy) and BoomSpikeSlab(0.7.0) (Scott) also appeared.…”
Section: CRAN Packages Screening (mentioning)
confidence: 99%
“…The following explication for the case study is based on Zeugner [38]. In the BMA of the case study, the prior probability reflects the uncertainty analyst's intuition about how probable she believes a model M_γ to be before looking at the data.…”
Section: Theory: A Definition of Uncertainty (mentioning)
confidence: 99%
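
For context on this excerpt, the prior model probability p(M_γ) enters the analysis through Bayes' rule for posterior model probabilities; with y denoting the data and X the candidate regressors, the standard relation is

$$
p(M_\gamma \mid y, X) = \frac{p(y \mid M_\gamma, X)\, p(M_\gamma)}{\sum_{\gamma'} p(y \mid M_{\gamma'}, X)\, p(M_{\gamma'})}
$$

so the weight the analyst places on M_γ before looking at the data is updated by the marginal likelihood p(y | M_γ, X).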
“…First, an adequate model that respects the body of evidence must be chosen. To this end, BMA is applied [38][39][40]. The method yields posterior inclusion probabilities (PIP) for all candidate explanatory variables, i.e.…”
Section: Theory: A Definition of Uncertainty (mentioning)
confidence: 99%
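
The PIP step described in this excerpt corresponds to the coef() method on a fitted bms object, which returns a table with a posterior inclusion probability for every candidate variable. The sketch below uses the data set bundled with BMS as a stand-in for the study's own data, the 0.5 cutoff is purely illustrative, and the column label "PIP" reflects the package's usual output and should be verified against the installed version.

```r
library(BMS)
data(datafls)   # stand-in data; the cited study used its own covariates

fit <- bms(datafls, burn = 5000, iter = 20000, user.int = FALSE)

pip_table <- coef(fit)   # table with posterior inclusion probabilities and moments
pip_table[, "PIP"]       # PIP of each candidate explanatory variable

# Candidate variables whose PIP exceeds an illustrative 0.5 threshold
rownames(pip_table)[pip_table[, "PIP"] > 0.5]
```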
“…We used multiple regression with Bayesian model averaging to identify the subset of predictors (plant GEBVs) that best explained variation in caterpillar performance GEBVs, while accounting for uncertainty in the effects of each covariate, including which covariates have non-zero effects. The multiple regression models were fit with the BMS package for R (package version 0.3.4, R version 3.4.2; Zeugner & Feldkircher, 2015). Zellner's g-prior was used for the regression coefficients with g = N, where N is the number of observations (N = 94; Zellner, 1986), and a uniform prior was used for the different models (i.e., sets of covariates with non-zero effects; Zeugner & Feldkircher, 2015).…”
Section: Methods (mentioning)
confidence: 99%
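
The specification quoted above maps onto two bms() arguments: g = "UIP" sets Zellner's g to the number of observations (g = N), and mprior = "uniform" places equal prior probability on all candidate models. The sketch below is hypothetical: it simulates stand-in data with 94 observations and ten made-up predictor columns, since the study's GEBV data are not reproduced here.

```r
library(BMS)
set.seed(1)

# Hypothetical stand-in for the study's data: response in the first column,
# ten simulated "plant GEBV" predictors in the remaining columns.
n <- 94
gebv_data <- data.frame(performance = rnorm(n),
                        matrix(rnorm(n * 10), nrow = n,
                               dimnames = list(NULL, paste0("plant_gebv", 1:10))))

# g = "UIP" gives Zellner's g-prior with g = N (here 94); mprior = "uniform"
# is the uniform prior over models described in the excerpt.
fit <- bms(gebv_data, g = "UIP", mprior = "uniform",
           burn = 10000, iter = 50000, user.int = FALSE)

coef(fit)   # posterior inclusion probabilities for the candidate predictors
```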