2019
DOI: 10.1002/sam.11428

Bayesian variable selection for logistic regression

Abstract: A key issue when using Bayesian variable selection for logistic regression is choosing an appropriate prior distribution. This can be particularly difficult for high‐dimensional data, where complete separation will naturally occur in the high‐dimensional space. We propose the use of the Normal‐Gamma prior with recommendations on calibration of the hyper‐parameters. We couple this choice with the use of joint credible sets to avoid performing a search over the high‐dimensional model space. The approach is shown …
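
The key ingredient named in the abstract is the Normal‐Gamma prior on the regression coefficients. As a rough illustration only, the sketch below draws coefficients from one common parameterisation of that prior and simulates binary outcomes under logistic regression; the parameterisation, hyper‐parameter values, and variable names are assumptions for illustration and are not the paper's calibration recommendations.

```python
import numpy as np
from scipy.special import expit

# Minimal sketch (not the paper's implementation): simulate coefficients from
# a Normal-Gamma prior and binary responses from a logistic regression model.
# Assumed parameterisation: beta_j | psi_j ~ N(0, psi_j),
#                           psi_j ~ Gamma(shape=lam, scale=2 * gamma**2).
rng = np.random.default_rng(seed=1)

n, p = 200, 1000          # high-dimensional setting: many more predictors than cases
lam, gamma = 0.1, 1.0     # hypothetical shape / scale hyper-parameters

# Gamma mixing distribution gives each coefficient its own prior variance.
psi = rng.gamma(shape=lam, scale=2 * gamma**2, size=p)

# Small lam pushes most psi_j towards zero, so most beta_j are shrunk heavily
# while a few can stay large -- the sparsity behaviour such priors are chosen for.
beta = rng.normal(loc=0.0, scale=np.sqrt(psi), size=p)

# Binary outcomes under the logistic model with these prior draws.
X = rng.standard_normal((n, p))
y = rng.binomial(1, expit(X @ beta))
```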

Cited by 2 publications (1 citation statement)
References: 28 publications

“…For “large n, large p” data-sets, which are now often encountered in some problems in genetics/genomics (such as genetic mapping studies), such algorithms must be carefully designed. In this work, we mainly consider Bayesian variable selection in generalised linear models and survival models and focus on three popular models: the logistic regression model [8, 9], the Cox proportional hazards model with partial likelihood [10, 11, 12, 13, 14] and the Weibull regression model [15]. In each case, we illustrate how carefully designed algorithms can facilitate effective posterior computation.…”
Section: Introduction (mentioning)
Confidence: 99%