2018
DOI: 10.1002/sim.7821

Sparse partial least squares with group and subgroup structure

Abstract: Integrative analysis of high-dimensional omics datasets has been studied by many authors in recent years. By incorporating prior known relationships among the variables, these analyses have been successful in elucidating the relationships between different sets of omics data. In this article, our goal is to identify important relationships between genomic expression and cytokine data from a human immunodeficiency virus vaccine trial. We propose a flexible partial least squares technique, which incorporates gr…
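The method sketched in the abstract, sparse group PLS, builds group structure directly into the PLS loading vectors. As a rough illustration of the idea only (not the authors' algorithm), here is a minimal Python sketch of a single group-sparse PLS component: a penalized power iteration on the cross-covariance matrix X^T Y with a group-lasso-style thresholding step. The thresholding rule, the single-component restriction, and the toy data are all assumptions made here for clarity.

```python
# Minimal sketch of a one-component group-sparse PLS direction, in the
# spirit of sparse group PLS (Liquet et al. 2016; Sutton et al. 2018).
# Illustrative re-implementation under simplifying assumptions, NOT the
# authors' code.
import numpy as np

def group_soft_threshold(z, groups, lam):
    """Group-wise soft thresholding: shrink whole groups of loadings,
    zeroing out any group whose Euclidean norm falls below lam."""
    u = np.zeros_like(z)
    for g in np.unique(groups):
        idx = groups == g
        norm_g = np.linalg.norm(z[idx])
        if norm_g > lam:
            u[idx] = (1.0 - lam / norm_g) * z[idx]
    return u

def sparse_group_pls_component(X, Y, groups, lam, n_iter=100, tol=1e-6):
    """First pair of loading vectors (u, v) from the cross-covariance
    matrix M = X^T Y, via alternating penalized power iterations
    (a common sparse-PLS strategy)."""
    M = X.T @ Y
    v = np.linalg.svd(M, full_matrices=False)[2][0]  # leading right singular vector
    for _ in range(n_iter):
        u = group_soft_threshold(M @ v, groups, lam)
        if np.linalg.norm(u) == 0:       # penalty removed every group
            break
        u /= np.linalg.norm(u)
        v_new = M.T @ u
        v_new /= np.linalg.norm(v_new)
        if np.linalg.norm(v_new - v) < tol:
            v = v_new
            break
        v = v_new
    return u, v

# Toy usage: 6 X-variables in 2 groups of 3, 2 responses.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 6))
Y = X[:, :3] @ rng.standard_normal((3, 2)) + 0.1 * rng.standard_normal((50, 2))
groups = np.array([0, 0, 0, 1, 1, 1])
u, v = sparse_group_pls_component(X, Y, groups, lam=1.0)
print("X-loadings by group:", u)
```

With a large enough penalty, the loadings of the uninformative second group are driven exactly to zero as a block, which is the qualitative behavior the group penalty is designed to produce.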

Cited by 11 publications (6 citation statements, published 2019–2024)
References 40 publications (114 reference statements)
“…In the present paper, we limited ourselves to lasso-based selection regression in Cox models. Other approaches, like component-wise boosting [35] and Sparse Partial Least Squares (Sparse PLS) [36], could be considered but are out of the scope of the present work. Similarly, while we only considered frequentist approaches, the a priori knowledge about biomarker groups could fit a Bayesian approach [37] with a hierarchical structured variable selection (HSVS) method.…”
Section: Table
Mentioning, confidence: 99%
“…Thus our generalizations, first established in Section 2.2 and expanded upon in Discussion, accommodate almost any data type, various metrics (e.g., Hellinger distance), various optimizations (e.g., PLS-, CCA-, or RDA-type optimizations), and even two strategies for ridge-like regularization. We have foregone any discussions of inference, stability, and resampling for PLS-CA-R because, as a generalization of PLS-R, many inference and stability approaches still apply, such as feature selection or sparsification (Sutton et al. 2018), additional regularization or sparsification approaches (Le Floch et al. 2012, Guillemot et al. 2019, Tenenhaus et al. 2014, Tenenhaus & Tenenhaus 2011), cross-validation (Wold et al. 1987, Rodríguez-Pérez et al. 2018, Kvalheim et al. 2019, Abdi 2010), permutation (Berry et al. 2011), various bootstrap (Efron 1979, Chernick 2008) approaches (Abdi 2010, Takane & Jung 2009) or tests (McIntosh & Lobaugh 2004, Krishnan et al. 2011), and other frameworks such as split-half resampling (Strother et al. 2002, Kovacevic et al. 2013, Strother et al. 2004), and are easily adapted for the PLS-CA-R and GPLS frameworks.…”
Section: Discussion
Mentioning, confidence: 99%
“…For sparse models, other parameters must also be adjusted by cross-validation, more specifically the leave-one-out method in the context of PLS. Recently, a sparse PLS approach (denoted sgPLS, for sparse group PLS, and introduced in Sutton et al. (2018) and Liquet et al. (2016)) used a bootstrap approach in Broc et al. (2021) to efficiently overcome the difficulty of the "large p, small n" multi-block framework and select optimal models. A meta-analysis of sparse PLS methodologies was performed by Mehmood et al. (2020) over 16 different types of strategies divided into 3 classes: "filter", "wrapper", and "embedded" methods.…”
Section: Introduction
Mentioning, confidence: 99%
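The excerpt above mentions tuning sparse PLS penalties by leave-one-out cross-validation. A minimal sketch of that loop, reusing the hypothetical sparse_group_pls_component() and toy data from the earlier sketch; the candidate penalty grid and the one-component predictor are assumptions made here for illustration.

```python
# Leave-one-out cross-validation over the sparsity level lam, as the
# excerpt above describes for sparse PLS tuning. Assumes the earlier
# sketch (sparse_group_pls_component, X, Y, groups) is in scope.
import numpy as np

def loo_press(X, Y, groups, lam):
    """Prediction error sum of squares (PRESS) under leave-one-out CV
    for a one-component group-sparse PLS fit at penalty level lam."""
    n = X.shape[0]
    press = 0.0
    for i in range(n):
        keep = np.arange(n) != i                 # hold out sample i
        u, _ = sparse_group_pls_component(X[keep], Y[keep], groups, lam)
        if np.linalg.norm(u) == 0:
            return np.inf                        # penalty zeroed the model out
        t = X[keep] @ u                          # latent scores on training rows
        c = (t @ Y[keep]) / (t @ t)              # regress Y on the score
        y_hat = (X[i] @ u) * c                   # predict the held-out row
        press += np.sum((Y[i] - y_hat) ** 2)
    return press

# Pick the penalty with the smallest leave-one-out prediction error.
lams = [0.1, 0.5, 1.0, 2.0, 4.0]
best = min(lams, key=lambda lam: loo_press(X, Y, groups, lam))
print("lam chosen by LOO-CV:", best)
```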