2018
DOI: 10.1109/tpami.2017.2776267
Safe Feature Screening for Generalized LASSO

Abstract: Solving Generalized LASSO (GL) problems is challenging, particularly when analyzing many features with a complex interacting structure. Recent developments have found effective ways to identify inactive features so that they can be removed or aggregated to reduce the problem size before applying optimization solvers for learning. However, existing methods are mostly devoted to special cases of GL problems with special structures for feature interactions, such as chains or trees. Developing screening rules, par…

Cited by 15 publications (9 citation statements)
References 14 publications
“…And this LASSO model method could decrease the characteristic dimension. Then, a multivariable Cox regression model to select driven genes that were most closely associated with survival was constructed and six methylation-driven genes were retained (22,23).…”
Section: Identification and Validation of Methylation-Driven Genes for CRC; citation type: mentioning; confidence: 99%
“…A strong correlation often exists between the variables, which is suggestive of high dimensionality and collinearity, and this method could decrease the characteristic dimension [20]. Then, we built a multivariate Cox regression model to select the genes that were most tightly associated with survival [21]. In addition, we validated this model in subgroups based on different characteristics.…”
Section: Methods; citation type: mentioning; confidence: 99%
“…Zeng and Cheung [89] applied a sparse-promoting penalty in learning feature weights in an unsupervised setting. Ren et al [57] investigated feature selection using a generalized LASSO model. In [3] the idea of annealing was combined with feature selection using a sparsity favoring loss function.…”
Section: Feature Redundancy; citation type: mentioning; confidence: 99%