2019
DOI: 10.3390/make1010021

High-Dimensional LASSO-Based Computational Regression Models: Regularization, Shrinkage, and Selection

Abstract: Regression models are a family of supervised learning methods that are important for machine learning, statistics, and general data science. Although classical ordinary least squares (OLS) regression has been known for a long time, in recent years many new developments have extended this model significantly. Above all, the least absolute shrinkage and selection operator (LASSO) model has gained considerable interest. In this paper, we review general regression models with a focus on the …

Cited by 110 publications (70 citation statements)
References 74 publications
“…Regarding the disadvantages of LASSO, it can select a limited number of features and often only one feature per group of features. In addition, for low-dimensional cases, model interpretability is low [52][53][54]. Finally, this study only focused on in-hospital mortality, and future studies will examine the nomogram over the long term.…”
Section: Discussion
Citation type: mentioning, confidence: 99%
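The grouping behavior these citing papers describe (LASSO keeping only one of several highly correlated predictors) can be seen in a minimal coordinate-descent sketch. This is an illustrative assumption on synthetic data, not code from the reviewed paper; the `soft_threshold` and `lasso_cd` helpers and all parameter values are hypothetical:

```python
# Illustrative sketch: coordinate-descent LASSO on two perfectly
# correlated features -- it keeps one and zeroes the other.
import random

def soft_threshold(rho, lam):
    """Soft-thresholding operator, the core of LASSO's shrinkage."""
    if rho > lam:
        return rho - lam
    if rho < -lam:
        return rho + lam
    return 0.0

def lasso_cd(X, y, lam, n_iter=200):
    """Plain coordinate-descent LASSO (no intercept, unstandardized)."""
    n, p = len(X), len(X[0])
    beta = [0.0] * p
    for _ in range(n_iter):
        for j in range(p):
            # partial residual excluding feature j
            resid = [y[i] - sum(X[i][k] * beta[k] for k in range(p) if k != j)
                     for i in range(n)]
            rho = sum(X[i][j] * resid[i] for i in range(n)) / n
            z = sum(X[i][j] ** 2 for i in range(n)) / n
            beta[j] = soft_threshold(rho, lam) / z
    return beta

random.seed(0)
x = [random.gauss(0, 1) for _ in range(100)]
X = [[v, v] for v in x]                    # two identical columns
y = [2 * v + random.gauss(0, 0.1) for v in x]
beta = lasso_cd(X, y, lam=0.1)
print(beta)  # one coefficient near 2, the other numerically zero
```

Because the two columns are identical, whichever coordinate is updated first absorbs the entire signal and the other is shrunk to zero, which is exactly the "one feature per group" behavior criticized above.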
“…Additionally, the relationship between the selected variables and e-cigarette use may not be adequately explained by a multivariable logistic regression model. Other limitations of the ML algorithms include the fact that Boruta is computationally expensive, especially for large datasets, and LASSO has no grouping property, and as such, tends to select only one variable from a group of highly correlated variables [ 54 , 81 ].…”
Section: Discussion
Citation type: mentioning, confidence: 99%
“…Overall, the advantage of a ridge regression and general regularized regression model is that regularization can reduce the variance by increasing the bias. Interestingly, this can improve the prediction accuracy of a model [19].…”
Section: Regularization: Ridge Regression
Citation type: mentioning, confidence: 99%
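The trade described in this statement (regularization lowering variance at the cost of bias) can be illustrated with a one-feature ridge estimator in closed form, beta(lambda) = sum(x*y) / (sum(x*x) + lambda). The `ridge_1d` helper and the simulated data are illustrative assumptions, not material from the cited work:

```python
# Illustrative sketch: closed-form ridge regression with one feature,
# showing the coefficient shrinking toward 0 as lambda grows.
import random

def ridge_1d(x, y, lam):
    """Ridge estimate beta = sum(x*y) / (sum(x*x) + lam), no intercept."""
    sxy = sum(a * b for a, b in zip(x, y))
    sxx = sum(a * a for a in x)
    return sxy / (sxx + lam)

random.seed(1)
x = [random.gauss(0, 1) for _ in range(200)]
y = [3 * v + random.gauss(0, 1) for v in x]   # true coefficient is 3

for lam in [0.0, 10.0, 100.0, 1000.0]:
    print(lam, round(ridge_1d(x, y, lam), 3))
# the estimate shrinks monotonically toward 0 as lam increases
```

The shrunken estimates are biased below the true coefficient, but their sampling variance is smaller, which is how regularization can improve prediction accuracy.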
“…This definition is based on SSR and SST in Equations (17) and (19). The COD is a measure of how well the model explains the variance of the response variables.…”
Section: R² and Adjusted R²
Citation type: mentioning, confidence: 99%
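Assuming SSR here denotes the residual sum of squares and SST the total sum of squares (the usual reading of this definition), the coefficient of determination is COD = 1 − SSR/SST and can be sketched as follows; the helper names and toy numbers are illustrative:

```python
# Illustrative sketch: COD (R^2) = 1 - SSR/SST, where SSR is the residual
# sum of squares and SST the total sum of squares about the mean.
def r_squared(y, y_hat):
    y_bar = sum(y) / len(y)
    ssr = sum((a - b) ** 2 for a, b in zip(y, y_hat))   # residual SS
    sst = sum((a - y_bar) ** 2 for a in y)              # total SS
    return 1 - ssr / sst

def adjusted_r_squared(r2, n, p):
    """Adjusted R^2 penalizes R^2 for the number of predictors p."""
    return 1 - (1 - r2) * (n - 1) / (n - p - 1)

y     = [1.0, 2.0, 3.0, 4.0]
y_hat = [1.1, 1.9, 3.2, 3.8]
print(r_squared(y, y_hat))             # ≈ 0.98
print(adjusted_r_squared(0.98, 4, 1))  # ≈ 0.97
```

A value near 1 means the fitted values explain almost all of the variance of the response, which is the interpretation the citing paper refers to.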