2013
DOI: 10.1109/TIT.2012.2227680

How Correlations Influence Lasso Prediction

Abstract: We study how correlations in the design matrix influence Lasso prediction. First, we argue that the higher the correlations, the smaller the optimal tuning parameter. This implies, in particular, that standard tuning parameters, which do not depend on the design matrix, are not favorable. Furthermore, we argue that Lasso prediction works well for any degree of correlation if suitable tuning parameters are chosen. We study these two subjects theoretically as well as with simulations.
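The abstract's first claim can be illustrated numerically. The following is a minimal simulation sketch, not the authors' code: it assumes an equicorrelated Gaussian design and a small sparse signal, sweeps a grid of tuning parameters, and reports the one minimizing the in-sample prediction error, which tends to shrink as the correlation grows.

```python
# Sketch (assumed setup, not the paper's code): how the prediction-optimal
# Lasso tuning parameter varies with design correlation.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p, s = 100, 50, 5                      # samples, predictors, sparsity (assumed)
beta = np.zeros(p)
beta[:s] = 1.0                            # s active coefficients
alphas = np.logspace(-3, 0, 40)           # candidate tuning parameters

for rho in (0.0, 0.5, 0.9):
    # Equicorrelated Gaussian design: every pair of columns has correlation rho.
    cov = np.full((p, p), rho) + (1.0 - rho) * np.eye(p)
    X = rng.multivariate_normal(np.zeros(p), cov, size=n)
    y = X @ beta + rng.normal(size=n)
    # Prediction error ||X(beta_hat - beta)||^2 / n for each tuning parameter;
    # its minimizer plays the role of the "optimal" parameter in the abstract.
    errs = []
    for a in alphas:
        b_hat = Lasso(alpha=a, max_iter=50_000).fit(X, y).coef_
        errs.append(np.mean((X @ (b_hat - beta)) ** 2))
    print(f"rho = {rho:.1f}  ->  optimal alpha ~ {alphas[int(np.argmin(errs))]:.4f}")
```

In runs of this sketch the minimizing alpha typically decreases as rho increases, matching the claim that design-independent tuning parameters are not favorable under strong correlations.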


Cited by 95 publications (80 citation statements)
References 23 publications
“…Least absolute shrinkage and selection operator regression provided a robust approach for specifying a predictive model from many candidate explanatory variables. Least absolute shrinkage and selection operator is known to be effective when explanatory variables are correlated (Dormann et al., 2013; Hebiri & Lederer, 2013), and so is particularly well-suited to the present data set, in which a few pairs of variables (e.g. TP and TN, TN and DOC, and TN and chl a) are strongly correlated (r > 0.7).…”
Section: Discussion (mentioning)
confidence: 99%
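As a hedged illustration of the workflow this citing study describes, the sketch below flags strongly correlated predictor pairs (|r| > 0.7) and then fits a cross-validated Lasso anyway. The data and the column names TP, TN, DOC, and chl_a are hypothetical stand-ins for the study's variables, not its data.

```python
# Illustrative sketch only: simulated stand-ins for the cited study's variables.
import numpy as np
import pandas as pd
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(1)
n = 200
tn = rng.normal(size=n)
df = pd.DataFrame({
    "TP":    0.80 * tn + rng.normal(scale=0.6, size=n),  # strongly tied to TN
    "TN":    tn,
    "DOC":   0.75 * tn + rng.normal(scale=0.6, size=n),
    "chl_a": 0.70 * tn + rng.normal(scale=0.7, size=n),
})
y = 1.5 * df["TN"].to_numpy() + rng.normal(size=n)

# Flag predictor pairs with |r| > 0.7, as in the quoted discussion.
corr = df.corr()
cols = list(df.columns)
pairs = [(a, b, round(corr.loc[a, b], 2))
         for i, a in enumerate(cols) for b in cols[i + 1:]
         if abs(corr.loc[a, b]) > 0.7]
print("strongly correlated pairs:", pairs)

# The Lasso still yields a usable sparse model despite the correlations.
model = LassoCV(cv=5).fit(df.to_numpy(), y)
print("coefficients:", dict(zip(cols, model.coef_.round(3))))
```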
“…It has been discussed that adaptive lasso as a regularization approach gives more stable coefficient estimates in multicollinearity scenarios than those from OLS or ordinary lasso (Tibshirani, 1996; Zou, 2006; Hebiri and Lederer, 2013), as sparsity is imposed at different levels on each neighborhood. In contrast, for methods like PC-stable and MMHC, sparsity is utilized for the partial correlations as a whole view.…”
Section: Two Stage Solution Search Algorithm (mentioning)
confidence: 99%
“…On the other hand, the NS-DIST method may avoid those issues by employing an adaptive lasso score function. It has been discussed that adaptive lasso as a regularization approach gives more stable coefficient estimates in multicollinearity scenarios than those from OLS or ordinary lasso (Tibshirani, 1996; Zou, 2006; Hebiri and Lederer, 2013), as sparsity is imposed at different levels on each neighborhood.…”
Section: Simulation Study (mentioning)
confidence: 99%
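The adaptive-lasso construction these statements refer to can be sketched as follows, following Zou (2006); this is an assumed illustration rather than code from the cited works. The pilot estimate here is a ridge fit (one common choice under multicollinearity, where OLS is unstable), and the coefficient-specific penalty weights are implemented by rescaling the design columns.

```python
# Adaptive lasso via column rescaling (Zou, 2006); illustrative sketch.
import numpy as np
from sklearn.linear_model import Lasso, Ridge

def adaptive_lasso(X, y, alpha=0.1, gamma=1.0):
    # Ridge pilot estimate: stable even when columns are collinear,
    # where plain OLS coefficients can blow up.
    beta_init = Ridge(alpha=1.0).fit(X, y).coef_
    # Adaptive weights are 1 / |beta_init_j|**gamma; rescaling column j by
    # |beta_init_j|**gamma turns the weighted L1 penalty into a plain one.
    scale = np.abs(beta_init) ** gamma + 1e-8   # avoid division by zero
    lasso = Lasso(alpha=alpha).fit(X * scale, y)
    return lasso.coef_ * scale                  # map back to the original scale

rng = np.random.default_rng(2)
X = rng.normal(size=(150, 20))
X[:, 1] = 0.95 * X[:, 0] + rng.normal(scale=0.3, size=150)  # collinear pair
y = 2.0 * X[:, 0] + rng.normal(size=150)
print(adaptive_lasso(X, y).round(2))
```

Because each coefficient gets its own weight, sparsity is imposed at a different level on each predictor, which is the stability argument quoted above.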
“…Recent work of Hebiri and Lederer (2013), Dalalyan et al. (2014), and others has examined in some detail how correlations among the columns x_{·j} of the design matrix X affect the performance of the LASSO and the selection of the penalty parameter λ. The broad conclusion of this work is that smaller values of λ are appropriate when the columns of X are more (positively) correlated, a relationship that holds for the permutation selection procedure.…”
Section: Some Connections With Universal Thresholds (mentioning)
confidence: 99%
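To make the quoted conclusion concrete, the sketch below compares a design-independent "universal" tuning parameter, σ√(2 log p / n), with a cross-validated choice as the column correlation ρ increases. The design, constants, and the use of cross-validation are assumptions made for illustration; this is not the permutation selection procedure the quote mentions.

```python
# Sketch (assumed setup): design-independent vs. data-driven tuning parameter.
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(3)
n, p, sigma = 100, 50, 1.0
beta = np.zeros(p)
beta[:5] = 1.0
# Design-independent reference value, sigma * sqrt(2 log(p) / n).
universal = sigma * np.sqrt(2.0 * np.log(p) / n)

for rho in (0.0, 0.5, 0.9):
    cov = np.full((p, p), rho) + (1.0 - rho) * np.eye(p)   # equicorrelated columns
    X = rng.multivariate_normal(np.zeros(p), cov, size=n)
    y = X @ beta + rng.normal(scale=sigma, size=n)
    alpha_cv = LassoCV(cv=5).fit(X, y).alpha_
    print(f"rho = {rho:.1f}  universal = {universal:.3f}  cv-selected = {alpha_cv:.3f}")
```

The universal value is fixed by n and p alone, while the cross-validated choice typically drifts downward as ρ grows, in line with the quoted conclusion.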