Bayesian Statistics 9 2011
DOI: 10.1093/acprof:oso/9780199694587.003.0022

Bayesian Models for Variable Selection that Incorporate Biological Information*

Abstract: Variable selection has been the focus of much research in recent years. Bayesian methods have found many successful applications, particularly in situations where the number of measured variables can be much greater than the number of observations. One such example is the analysis of genomics data. In this paper we first review Bayesian variable selection methods for linear settings, including regression and classification models. We focus in particular on recent prior constructions that have been used …

Cited by 13 publications (14 citation statements); references 63 publications.
“…Specifically, we use continuous structured spike-and-slab priors (George and McCulloch, 1997; Ishwaran et al., 2005; Andersen et al., 2014) within the regression model for a specific layer to (i) blend prior information about the correlation between the genomic markers, and (ii) borrow information about the selected genomic markers from the estimation at one spherical layer and propagate the information through the prior of the model to the subsequent spherical layer(s). As opposed to existing approaches that incorporate a priori structural information (Li and Zhang, 2010; Vannucci and Stingo, 2010), our prior formulation incorporates structural information not only for the dependence between the predictors, but also for the dependence between the multivariate response and across the spherical layers.…”
Section: Statistical Framework (mentioning)
confidence: 99%
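The continuous spike-and-slab construction cited above (George and McCulloch, 1997) can be sketched as a two-component normal mixture on each regression coefficient: a narrow "spike" when the predictor is excluded and a diffuse "slab" when it is included. The values of `tau`, `c`, and the inclusion probability `pi` below are illustrative assumptions, not parameters taken from the cited papers:

```python
import numpy as np

rng = np.random.default_rng(0)

# Continuous spike-and-slab prior:
#   beta_j ~ (1 - gamma_j) * N(0, tau^2) + gamma_j * N(0, (c*tau)^2)
# Illustrative hyperparameters (assumed, not from the cited papers):
tau, c, pi = 0.05, 10.0, 0.2
p = 5  # number of predictors

gamma = rng.random(p) < pi           # inclusion indicators gamma_j in {0, 1}
sd = np.where(gamma, c * tau, tau)   # slab sd if included, spike sd otherwise
beta = rng.normal(0.0, sd)           # coefficients drawn from the mixture
```

In the structured variants referenced in the excerpt, the marginal Bernoulli draw for `gamma` is replaced by a prior that links the indicators across correlated markers and across layers.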
“…Many Bayesian methods developed for GWA analysis emphasize that they allow the integration of prior information, but few publications have focused on the practicalities of this problem. To our knowledge, the work performed so far has focused on including prior information either at the level of the SNP effects (Fridley et al., 2010; Spencer et al., 2016; Stingo, Chen, Tadesse, & Vannucci, 2011; Vannucci & Stingo, 2010) or at the level of the SNP's probability of association with the outcome (Lewinger, Conti, Baurley, Triche, & Thomas, 2007; Thompson et al., 2013).…”
Section: Introduction (mentioning)
confidence: 99%
“…() is suboptimal because the method selects the same covariates for all responses. Several priors based on an MRF have been proposed to incorporate biological information on the structured dependence between covariates in variable selection for univariate regression models (Li and Zhang, 2010; Vannucci and Stingo, 2010; Stingo et al., 2011). Our approach differs from these existing MRF models in that we leverage the dependence among multiple outcomes in our prior specification for the variable selection indicators, as opposed to specifying it based on the network structure known a priori among covariates.…”
Section: Introduction (mentioning)
confidence: 99%
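The MRF priors on selection indicators mentioned in this excerpt typically place an Ising-type distribution on the binary vector gamma, rewarding the joint selection of predictors that are neighbors in a known biological network. A minimal sketch of the (unnormalized) log-prior follows; the hyperparameters `mu`, `eta` and the toy adjacency matrix are assumptions for illustration only:

```python
import numpy as np

def mrf_log_prior(gamma, G, mu=-2.0, eta=0.5):
    """Unnormalized MRF log-prior on inclusion indicators:
    log p(gamma) ∝ mu * sum(gamma) + eta * gamma' G gamma,
    where G is a known predictor network (symmetric 0/1 adjacency),
    mu < 0 encodes sparsity and eta > 0 rewards selecting linked predictors."""
    gamma = np.asarray(gamma, dtype=float)
    return mu * gamma.sum() + eta * gamma @ G @ gamma

# Toy network (assumed): predictors 0 and 1 are connected, 2 is isolated.
G = np.array([[0, 1, 0],
              [1, 0, 0],
              [0, 0, 0]])

# Selecting two linked predictors scores higher than two unlinked ones.
linked = mrf_log_prior([1, 1, 0], G)    # mu*2 + eta*2 = -3.0
unlinked = mrf_log_prior([1, 0, 1], G)  # mu*2 + 0     = -4.0
```

Extending such a prior across multiple outcomes, as the authors describe, amounts to coupling the indicator vectors of different responses rather than (or in addition to) coupling covariates through G.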