2020
DOI: 10.1111/biom.13355

Bayesian variable selection for non‐Gaussian responses: a marginally calibrated copula approach

Abstract: We propose a new, highly flexible and tractable Bayesian approach to undertake variable selection in non‐Gaussian regression models. It uses a copula decomposition for the joint distribution of observations on the dependent variable. This allows the marginal distribution of the dependent variable to be calibrated accurately using a nonparametric or other estimator. The family of copulas employed consists of “implicit copulas” that are constructed from existing hierarchical Bayesian models widely used for variable selec…
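
To make the abstract's central idea concrete, the display below sketches the generic copula decomposition that such an approach builds on. This is a general statement of Sklar's theorem applied to the response vector, not a reproduction of the paper's full implicit-copula construction; the symbols F_Y, f_Y and c are generic.

\[
  p(y_1,\dots,y_n \mid \mathbf{X})
    \;=\; c\bigl(F_Y(y_1),\dots,F_Y(y_n) \mid \mathbf{X}\bigr)
          \prod_{i=1}^{n} f_Y(y_i),
\]

where F_Y is the marginal distribution function of the dependent variable, which can be calibrated with a nonparametric or other estimator, f_Y is its density, and c is the implicit copula density constructed from a hierarchical Bayesian regression model that carries the variable-selection structure.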

Cited by 3 publications (1 citation statement); references 36 publications (78 reference statements).

“…This setting, which combines the g-prior on regression coefficients with a binomial prior on model space, is the major workhorse model for implementing Bayesian variable selection. The theoretical underpinnings of Bayesian variable selection are well understood in linear regression with both Gaussian (Hoeting et al., 1999) and non-Gaussian (Kundu and Dunson, 2014; Klein and Smith, 2021) errors, as well as in nonparametric regression (Smith and Kohn, 1996; Kohn et al., 2001). At the same time, variable selection with the g-prior provides the foundation for some of the most interesting Bayesian work on computation in high-dimensional settings.…”
Section: Model Choice With Many Models: Bayesian Model Averaging
Citation type: mentioning
Confidence: 99%
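
The g-prior setting described in this citation statement can be illustrated with a small enumeration sketch. The code below is a self-contained toy example, not the cited paper's implementation: it scores every subset of covariates by the standard closed-form Bayes factor against the null model under Zellner's g-prior (with a unit-information default g = n), combines that with an independent Bernoulli(pi) prior on inclusion (i.e., a binomial prior on model size), and reports posterior inclusion probabilities. All function and variable names are illustrative.

import itertools
import numpy as np

def log_bf_gprior(y, X, gamma, g):
    """Log Bayes factor of the model indexed by `gamma` (tuple of column
    indices) against the intercept-only null model under Zellner's g-prior."""
    n = len(y)
    yc = y - y.mean()
    p_g = len(gamma)
    if p_g == 0:
        return 0.0
    Xg = X[:, gamma]
    Xg = Xg - Xg.mean(axis=0)              # centre columns; intercept absorbed
    beta_hat, *_ = np.linalg.lstsq(Xg, yc, rcond=None)
    rss = np.sum((yc - Xg @ beta_hat) ** 2)
    r2 = 1.0 - rss / np.sum(yc ** 2)       # R^2 of the candidate model
    return (0.5 * (n - 1 - p_g) * np.log1p(g)
            - 0.5 * (n - 1) * np.log1p(g * (1.0 - r2)))

def posterior_inclusion(y, X, g=None, pi=0.5):
    """Enumerate all subsets, combine g-prior marginal likelihoods with an
    independent Bernoulli(pi) prior on inclusion (a binomial prior on model
    space), and return the posterior inclusion probability of each covariate."""
    n, p = X.shape
    g = n if g is None else g               # unit-information default
    models, log_post = [], []
    for k in range(p + 1):
        for gamma in itertools.combinations(range(p), k):
            lp = (log_bf_gprior(y, X, gamma, g)
                  + len(gamma) * np.log(pi)
                  + (p - len(gamma)) * np.log(1.0 - pi))
            models.append(gamma)
            log_post.append(lp)
    log_post = np.array(log_post)
    w = np.exp(log_post - log_post.max())
    w /= w.sum()                            # posterior model probabilities
    incl = np.zeros(p)
    for gamma, wt in zip(models, w):
        incl[list(gamma)] += wt
    return incl

# Toy usage: 3 relevant predictors out of 8.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 8))
y = X[:, 0] - 2 * X[:, 1] + 0.5 * X[:, 2] + rng.standard_normal(100)
print(np.round(posterior_inclusion(y, X), 3))

With only eight candidate covariates the 2^8 models can be enumerated exactly; in higher dimensions the same marginal-likelihood-plus-prior weighting is typically explored by MCMC over model space rather than by full enumeration.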