2020
DOI: 10.1016/j.csda.2019.106881

A novel Bayesian approach for variable selection in linear regression models

Abstract: We propose a novel Bayesian approach to the problem of variable selection in multiple linear regression models. In particular, we present a hierarchical setting which allows for direct specification of a-priori beliefs about the number of nonzero regression coefficients as well as a specification of beliefs that given coefficients are nonzero. To guarantee numerical stability, we adopt a g-prior with an additional ridge parameter for the unknown regression coefficients. In order to simulate from the joint post…

Citations: Cited by 8 publications (9 citation statements)
References: 43 publications
“…The hierarchical modelling framework was implemented in JAGS (Plummer, 2013a), using the package rjags in R (Plummer, 2013b; R Core Team, 2013), which enabled both the estimation of parameter values from prior distributions with MCMC and the generation of model-averaged predictions. The MCMC sampling had three parallel chains with 25,000 iterations for each chain.…”
Section: Model Evaluation and Implementation
confidence: 99%
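The rjags workflow described in this excerpt can be illustrated with a minimal sketch. Only the use of three parallel chains and 25,000 iterations per chain comes from the excerpt; the model string, priors, and simulated data below are placeholders, not the citing study's actual model.

```r
## Minimal rjags sketch: hypothetical linear model, three chains, 25,000 iterations
library(rjags)

model_string <- "
model {
  for (i in 1:N) {
    y[i] ~ dnorm(mu[i], tau)        # JAGS normal is parameterised by precision
    mu[i] <- beta0 + beta1 * x[i]
  }
  beta0 ~ dnorm(0, 0.001)           # vague priors (illustrative choices)
  beta1 ~ dnorm(0, 0.001)
  tau   ~ dgamma(0.001, 0.001)
}"

set.seed(1)
dat <- list(y = rnorm(50), x = rnorm(50), N = 50)   # toy data for illustration

jm <- jags.model(textConnection(model_string), data = dat, n.chains = 3)
samples <- coda.samples(jm, variable.names = c("beta0", "beta1"), n.iter = 25000)
summary(samples)
```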
“…1.00 and 0.34 for pre-event NDVI and event NDVI, respectively, for DON in cluster 2). The BMA can handle the collinearity by shrinking the posterior inclusion probability of one of the correlated variables towards zero (Nakagawa and Freckleton, 2011; Posch et al., 2020; Walker, 2019). This shrinkage effect leads to a lower posterior probability of a more complex model that includes correlated variables, because each extra predictor dilutes the prior density of the existing predictor that it correlates with.…”
Section: Key Drivers of Temporal Variability in Water Quality
confidence: 99%
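The shrinkage effect this excerpt refers to can be reproduced on toy data with a small base-R model-averaging sketch. The data-generating process and the BIC approximation to the posterior model probabilities below are assumptions made purely for illustration; they are not taken from the citing study.

```r
## Toy illustration: BMA shrinks the inclusion probability of one of two
## nearly collinear predictors (x2) towards zero when the other (x1) is truly active.
set.seed(1)
n  <- 200
x1 <- rnorm(n)
x2 <- x1 + rnorm(n, sd = 0.1)            # x2 is nearly collinear with x1
x3 <- rnorm(n)
y  <- 1 + 2 * x1 + 0.5 * x3 + rnorm(n)
X  <- cbind(x1, x2, x3)

# Enumerate all 2^3 predictor subsets and score each model by BIC
subsets <- expand.grid(rep(list(c(FALSE, TRUE)), 3))
colnames(subsets) <- colnames(X)
bic <- apply(subsets, 1, function(s) {
  if (!any(s)) BIC(lm(y ~ 1)) else BIC(lm(y ~ X[, s, drop = FALSE]))
})

# Convert BIC scores into approximate posterior model probabilities
w <- exp(-0.5 * (bic - min(bic)))
w <- w / sum(w)

# Posterior inclusion probability of each predictor: x2 is shrunk towards zero
colSums(as.matrix(subsets) * w)
```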
“…• Bayes linear selection (BLS). Recently, Posch et al. (2020) proposed a Bayesian variable selection approach for linear regression models. Since our approach shares some common features (e.g.…”
Section: Discussion
confidence: 99%
“…Following Posch et al. (2020), to specify the proposal for the random set A, a Bernoulli distributed random variable c_h is first introduced…”
Section: Details Step
confidence: 99%
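As a rough illustration of a Bernoulli-driven proposal for the active set A, the sketch below flips the membership of each coordinate according to an independent Bernoulli draw c_h. Both the per-coordinate flip mechanism and the 0.5 flip probability are assumptions for illustration only, not the exact construction used by Posch et al. (2020).

```r
## Generic Bernoulli proposal for an active set A of predictor indices (illustrative only)
set.seed(1)
p <- 10              # number of candidate predictors
A <- c(1, 4, 7)      # current active set: indices of nonzero coefficients

propose_A <- function(A, p, prob = 0.5) {
  # Draw an independent Bernoulli indicator c_h for each coordinate h = 1, ..., p
  c_h <- rbinom(p, size = 1, prob = prob)
  in_A <- seq_len(p) %in% A
  # Flip the membership of coordinate h whenever c_h = 1
  new_in_A <- xor(in_A, c_h == 1)
  which(new_in_A)
}

propose_A(A, p)      # a proposed new active set
```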