2016
DOI: 10.1007/s12559-016-9402-z

Discriminative Lasso

Abstract: Lasso-type variable selection has been demonstrated to be effective in handling high-dimensional data. From the biological perspective, traditional Lasso-type models are capable of learning which stimuli are valuable while ignoring the many that are not, and thus perform feature selection. However, traditional Lasso tends to over-emphasize sparsity and to overlook the correlations between features. These drawbacks have been shown to limit its performance on real-world feature selectio…
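As a point of reference for the abstract, the baseline it builds on can be sketched with scikit-learn's standard `Lasso`: the L1 penalty zeroes out most coefficients, so the surviving (nonzero) coefficients act as selected features. The synthetic data, dimensions, and `alpha` value below are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from sklearn.linear_model import Lasso

# Synthetic high-dimensional-style data: only the first 3 of 20
# features carry signal (illustrative setup, not from the paper).
rng = np.random.default_rng(0)
n, p = 100, 20
X = rng.standard_normal((n, p))
y = 3.0 * X[:, 0] + 2.0 * X[:, 1] + 1.5 * X[:, 2] + 0.1 * rng.standard_normal(n)

# L1-penalized regression; nonzero coefficients = selected features.
model = Lasso(alpha=0.1).fit(X, y)
selected = np.flatnonzero(model.coef_ != 0)
print(selected)
```

Note that this plain Lasso penalizes only for sparsity; it has no term discouraging the selected features from being correlated with one another, which is exactly the limitation the abstract points to.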

Cited by 7 publications (4 citation statements)
References 33 publications
“…[15b,26] Typically, the LRScore-based method successfully selected features and yielded significantly improved diagnostic results compared to the control method (without any feature selection process, p < 0.05, by paired t-test), owing to its utilization of L1 regularization to recognize the predictive metabolic features, which was also promising for other metabolic fingerprints and biomedical applications. [27] Through metabolomics with advanced analytical tools and machine learning, we successfully built a diagnostic model toward ESCC, which held great promise as a valuable complement to current diagnostic methods. Particularly, classical diagnostic strategies for ESCC, such as endoscopy combined with biopsies for histopathological confirmation, are still hampered by its invasiveness and low compliance rates (down to 33.5% [5b] ).…”
Section: Discussion (mentioning)
confidence: 99%
“…The composite absolute penalties (CAP) family was introduced by Zhao et al. [42]; it allows known grouping and hierarchical relationships among the variables to be expressed. After observing the limitations of the traditional lasso, Zhang et al. [43] proposed a discriminative lasso to select features that are strongly correlated with the response and less correlated with each other.…”
Section: Structured Sparsity Algorithms (mentioning)
confidence: 99%
“…In this way the MI-trials are embedded as points in a high-dimensional space, with dimensionality equal to the number of sensors. Considering that an abundance of training data is rarely encountered in BCI practice, and that computational efficiency is usually sought by working within reduced (transformed) data-spaces, we employ a recently introduced feature selection technique, the discriminative Lasso (dLasso) [142], which defines a subset of non-redundant features with high discriminatory power. dLasso deviates from the lasso-type feature selection process as it introduces stricter nonconvex constraints, while considering both the response-variable and the variable-variable correlations.…”
Section: Feature Extraction/Selection and Classification (mentioning)
confidence: 99%
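The selection principle both excerpts describe (prefer features strongly correlated with the response and weakly correlated with each other) can be illustrated with a simple greedy relevance-minus-redundancy score. This is only a hedged stand-in for intuition: discriminative Lasso encodes the same trade-off through a penalized regression objective, not through the greedy procedure sketched here, and all data and function names below are illustrative.

```python
import numpy as np

def relevance_redundancy_select(X, y, k):
    """Greedily pick k features: high |corr(feature, y)| (relevance),
    low mean |corr(feature, chosen)| (redundancy). Illustrative only."""
    p = X.shape[1]
    relevance = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(p)])
    chosen = [int(np.argmax(relevance))]
    while len(chosen) < k:
        scores = {}
        for j in set(range(p)) - set(chosen):
            redundancy = np.mean([abs(np.corrcoef(X[:, j], X[:, s])[0, 1])
                                  for s in chosen])
            scores[j] = relevance[j] - redundancy
        chosen.append(max(scores, key=scores.get))
    return chosen

# Toy data: x0 and x1 are near-duplicates of the main signal z,
# while x2 is an independent secondary signal in the response.
rng = np.random.default_rng(1)
n = 200
z = rng.standard_normal(n)
x0 = z + 0.05 * rng.standard_normal(n)   # strongly relevant
x1 = x0 + 0.30 * rng.standard_normal(n)  # relevant but redundant with x0
x2 = rng.standard_normal(n)              # independent secondary signal
X = np.column_stack([x0, x1, x2])
y = z + 0.5 * x2
print(relevance_redundancy_select(X, y, 2))  # the redundant twin is skipped
```

A plain sparsity criterion could happily keep both near-duplicate features; the redundancy term is what pushes the second pick toward the independent feature, which is the behavior the quoted passages attribute to dLasso.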