2019
DOI: 10.1002/sim.8322
Robust semiparametric gene‐environment interaction analysis using sparse boosting

Abstract: For the pathogenesis of complex diseases, gene‐environment (G‐E) interactions have been shown to have important implications. G‐E interaction analysis can be challenging with the need to jointly analyze a large number of main effects and interactions and to respect the “main effects, interactions” hierarchical constraint. Extensive methodological developments on G‐E interaction analysis have been conducted in recent literature. Despite considerable successes, most of the existing studies are still limited as t…

Cited by 13 publications (19 citation statements); references 50 publications.
“…For example, Boosting, a popular machine learning method, aggregates multiple weak learners (individual features of weak predictive power for the response variable) into a strong learner (a model of strong predictive power) ([54,55,56]). Within a regression framework, boosting has strong connections to penalization ([57–59]), which makes it a natural choice for detecting important G × E interactions ([60,61]). Support vector machine, another popular machine learning technique which is tightly connected to penalization in the form of "hinge loss + ridge penalty", can also be adopted for G × E interactions ([62,63]).…”
Section: Other Variable Selection Methods (mentioning; confidence: 99%)
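The boosting scheme the quote describes, repeatedly fitting individual features as weak learners and aggregating them into a strong learner, can be sketched as generic componentwise L2 boosting. This is an illustrative sketch, not the cited authors' implementation; the function name, step size `nu`, and number of steps are assumptions.

```python
import numpy as np

def componentwise_l2_boost(X, y, n_steps=100, nu=0.1):
    """Componentwise L2 boosting: at each step, fit every single
    feature (weak learner) to the current residuals by least squares,
    then update only the best-fitting coefficient by a small step nu.
    The accumulated coefficient vector is the 'strong learner'."""
    n, p = X.shape
    beta = np.zeros(p)
    intercept = y.mean()
    resid = y - intercept
    for _ in range(n_steps):
        # univariate least-squares coefficient of each column on the residuals
        coefs = (X.T @ resid) / (X ** 2).sum(axis=0)
        # residual sum of squares achieved by each weak learner
        sse = ((resid[:, None] - X * coefs) ** 2).sum(axis=0)
        j = np.argmin(sse)               # best single feature this step
        beta[j] += nu * coefs[j]         # shrunken update (implicit penalization)
        resid -= nu * coefs[j] * X[:, j]
    return intercept, beta
```

The small step size is what gives boosting its connection to penalization: early stopping plays the role of a sparsity-inducing penalty, so coefficients of unselected features remain exactly zero.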
“…Such a connection is the driving force behind the development of new boosting methods for detecting G × E interactions. As increasing attention has been paid to the robustness and hierarchical structure of interaction studies, Wu and Ma ([61]) have proposed a robust semiparametric sparse boosting approach for simultaneous identification of both linear and nonlinear interactions. They have adopted the Huber loss function for robustness, and a multiple imputation approach to accommodate missing values in the E factor.…”
Section: Remarks On the Choices Of Penalty Functions Under Model (6) (mentioning; confidence: 99%)
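The Huber loss adopted for robustness in the cited approach is quadratic for small residuals and linear in the tails, which caps the influence of outlying observations on the boosting updates. A minimal sketch follows; the threshold `delta=1.345` is the conventional default for this loss, not necessarily the cited paper's choice.

```python
import numpy as np

def huber_loss(r, delta=1.345):
    """Huber loss: 0.5*r^2 for |r| <= delta, and
    delta*(|r| - 0.5*delta) beyond, so large residuals
    contribute only linearly rather than quadratically."""
    quad = np.abs(r) <= delta
    return np.where(quad, 0.5 * r ** 2, delta * (np.abs(r) - 0.5 * delta))

def huber_negative_gradient(r, delta=1.345):
    """Negative gradient of the Huber loss with respect to the fit:
    the raw residual clipped at +/- delta. In a robust boosting step
    this clipped residual replaces the ordinary residual as the
    working response, bounding each outlier's influence."""
    return np.clip(r, -delta, delta)
```

In a robust variant of componentwise boosting, the weak learners would be fit to `huber_negative_gradient(resid)` instead of `resid` at each step.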
“…Our model can be potentially extended in the following aspects. First, as data contamination and outliers have been widely observed in repeated measurements, robust variable selection methods in G × E interaction studies [23,50–52] can be extended to longitudinal settings. Second, multiple Bayesian methods have recently been proposed for pinpointing important G × E interaction effects [53–55].…”
Section: Discussion (mentioning; confidence: 99%)
“…The proposed approach is based on sparse boosting, 19 which demonstrates competitive performance in high-dimensional data analysis compared to penalization and other techniques. 15,20 In all, this study is warranted as it provides a practically useful new approach for exploring heterogeneity and commonality across multiple high-dimensional datasets.…”
Section: Introduction (mentioning; confidence: 99%)