2020
DOI: 10.3390/e22030329
A Two-Stage Mutual Information Based Bayesian Lasso Algorithm for Multi-Locus Genome-Wide Association Studies

Abstract: Genome-wide association study (GWAS) has turned out to be an essential technology for exploring the genetic mechanism of complex traits. To reduce the complexity of computation, it is well accepted to remove unrelated single nucleotide polymorphisms (SNPs) before GWAS, e.g., by using the iterative sure independence screening expectation-maximization Bayesian Lasso (ISIS EM-BLASSO) method. In this work, a modified version of ISIS EM-BLASSO is proposed, which reduces the number of SNPs by a screening methodology bas…

Cited by 7 publications (3 citation statements)
References 32 publications
“…Today, it is well established that mutual information is an appropriate metric to measure the association between SNPs and qualitative (case–control) phenotypes [39, 44, 46, 74, 75, 76, 77]. However, we apply here for the first time this metric to quantitative traits.…”
Section: Results
confidence: 99%
“…The larger the mutual information value is, the stronger the correlation between the two random variables. In GWAS, the phenotypic vector is considered one random variable, and the genotype vector is another random variable [41].…”
Section: Mutual Information
confidence: 99%
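The statement above can be illustrated with a minimal sketch of how mutual information is estimated from a genotype vector and a discrete phenotype vector. This is a plug-in estimator built from the empirical joint distribution, not the paper's own implementation; the toy genotype/phenotype data below are hypothetical (genotypes coded 0/1/2 as minor-allele counts, phenotype coded 0/1 for case–control).

```python
import numpy as np

def mutual_information(x, y):
    """Plug-in estimate of mutual information (in nats) between two
    discrete vectors, computed from their empirical joint distribution."""
    x = np.asarray(x)
    y = np.asarray(y)
    mi = 0.0
    for xv in np.unique(x):
        for yv in np.unique(y):
            p_xy = np.mean((x == xv) & (y == yv))  # joint probability
            p_x = np.mean(x == xv)                  # marginal of x
            p_y = np.mean(y == yv)                  # marginal of y
            if p_xy > 0:
                mi += p_xy * np.log(p_xy / (p_x * p_y))
    return mi

# Hypothetical toy data: one SNP (0/1/2 genotypes) and a binary phenotype.
genotype = np.array([0, 0, 1, 1, 2, 2, 0, 1, 2, 2])
phenotype = np.array([0, 0, 0, 1, 1, 1, 0, 0, 1, 1])
print(mutual_information(genotype, phenotype))
```

A larger value indicates a stronger statistical dependence between the SNP and the phenotype; independent vectors give a value near zero.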
“…Regression models provide useful frameworks for multivariate mutual information analysis that uses information concepts when choosing covariates (also called features) that are important for the analysis and prediction. A recent article that includes both the concept of mutual information and the Lasso is [1]. This paper develops properties of methods that use the information in a vector X to reduce prediction error, that is, to reduce entropy.…”
Section: Introduction
confidence: 99%
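The covariate-selection role of the Lasso described above can be sketched as follows. This is a generic illustration using scikit-learn's `Lasso` on simulated genotypes, not the two-stage ISIS EM-BLASSO algorithm from the paper; the dimensions, effect sizes, and regularization strength are assumptions chosen only to make the sparsity effect visible.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n_samples, n_snps = 200, 50

# Simulated genotype matrix: 0/1/2 minor-allele counts per SNP.
X = rng.integers(0, 3, size=(n_samples, n_snps)).astype(float)

# Only two SNPs (indices 3 and 17, chosen arbitrarily) truly affect the trait.
beta = np.zeros(n_snps)
beta[[3, 17]] = [1.5, -2.0]
y = X @ beta + rng.normal(0.0, 0.5, n_samples)

# L1 penalty drives most coefficients exactly to zero, selecting covariates.
model = Lasso(alpha=0.1).fit(X, y)
selected = np.flatnonzero(model.coef_ != 0)
print("selected SNP indices:", selected)
```

The nonzero coefficients identify the retained covariates; SNPs with no predictive information are shrunk to exactly zero, which is the entropy-reduction role the quoted passage attributes to such methods.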