2018
DOI: 10.3233/mas-180446
Ridge Regression and multicollinearity: An in-depth review

Cited by 48 publications (31 citation statements)
References 2 publications
“…No prior significance was assumed. Model assumptions were checked using variance inflation factors to measure collinearity among variables, 35,36 with variance inflation factors greater than 5 used as a cutoff for potential multicollinearity. 37 Likelihood ratio tests and the Akaike Information Criterion were used to validate the use of the full variable set by comparing against nested models.…”
Section: Methods
confidence: 99%
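The workflow quoted above (VIF screening with a cutoff of 5, then AIC and likelihood-ratio comparisons against nested models) can be sketched in Python. This is a minimal illustration on synthetic data with hypothetical predictor names, not the cited study's code; it relies on statsmodels' variance_inflation_factor and compare_lr_test:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(0)
# Hypothetical predictors; x3 is deliberately near-collinear with x1.
X = pd.DataFrame({
    "x1": rng.normal(size=200),
    "x2": rng.normal(size=200),
})
X["x3"] = X["x1"] + rng.normal(scale=0.1, size=200)

# VIFs are computed on the design matrix including an intercept.
design = sm.add_constant(X)
vif = {col: variance_inflation_factor(design.values, i)
       for i, col in enumerate(design.columns) if col != "const"}
flagged = [name for name, v in vif.items() if v > 5]  # VIF > 5 cutoff
print(vif, flagged)

# AIC and likelihood-ratio comparison of the full model against a nested one.
y = 2 * X["x1"] - X["x2"] + rng.normal(size=200)
full = sm.OLS(y, design).fit()
nested = sm.OLS(y, sm.add_constant(X[["x1", "x2"]])).fit()
lr_stat, p_value, df_diff = full.compare_lr_test(nested)
print(full.aic, nested.aic, p_value)  # lower AIC favors that model
```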
“…To further combat the multicollinearity, we then tested the multiple regression model using the elastic nets method with Akaike's information selection criterion 43,44, using as predictors of BIS the two remaining factors. This analysis yielded both factors of S_∆tHRV & ∆Midl-ϑPow and S_∆tHRV & ∆CzPz-αPow as potential predictors of BIS (F(2,59) = 9.01, p < 0.01, η²p = 0.190; R-Square = 0.234; Glmselect procedure, SAS-9.4 45).…”
Section: Results
confidence: 99%
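For readers unfamiliar with the approach, here is a rough Python analogue of elastic-net selection scored by AIC. It is not the SAS GLMSELECT procedure the quoted study ran: scikit-learn has no built-in AIC-driven elastic net, so this sketch scores a penalty grid with a hand-computed Gaussian AIC, and the data and two-predictor setup are synthetic stand-ins for the HRV/EEG factors:

```python
import numpy as np
from sklearn.linear_model import ElasticNet

rng = np.random.default_rng(1)
n = 62  # illustrative sample size
X = rng.normal(size=(n, 2))  # two candidate factor scores
y = 0.4 * X[:, 0] + 0.3 * X[:, 1] + rng.normal(size=n)

def gaussian_aic(model, X, y):
    # Gaussian AIC up to an additive constant; count nonzero
    # coefficients plus the intercept as parameters.
    resid = y - model.predict(X)
    k = np.count_nonzero(model.coef_) + 1
    return n * np.log(resid @ resid / n) + 2 * k

# Fit an elastic net at each penalty level, then keep the lowest-AIC fit.
fits = [ElasticNet(alpha=a, l1_ratio=0.5, max_iter=10_000).fit(X, y)
        for a in np.logspace(-3, 0, 20)]
best = min(fits, key=lambda m: gaussian_aic(m, X, y))
print(best.alpha, best.coef_)
```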
“…In addition, XGBoost implements a general tree-boosting algorithm. Compared with GBDT, it adds Lasso (L1) [52] or Ridge (L2) [53] regularization to avoid overfitting, uses the second-derivative information of the cost function, and introduces the idea of column sampling. XGBoost significantly improves the efficiency and generalization of the prediction model.…”
Section: Machine-learning-based Model Fusion
confidence: 99%
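The regularization options this quote attributes to XGBoost map directly onto its API: reg_alpha is the L1 (lasso-style) penalty and reg_lambda the L2 (ridge-style) penalty on leaf weights, while colsample_bytree implements the column-sampling idea. A minimal sketch on synthetic data (the parameter values are illustrative, not taken from the cited paper):

```python
import numpy as np
import xgboost as xgb

rng = np.random.default_rng(2)
X = rng.normal(size=(500, 10))
y = X[:, 0] - 2 * X[:, 1] + rng.normal(scale=0.5, size=500)

model = xgb.XGBRegressor(
    n_estimators=200,
    reg_alpha=0.1,         # L1 (lasso-style) penalty on leaf weights
    reg_lambda=1.0,        # L2 (ridge-style) penalty on leaf weights
    colsample_bytree=0.8,  # column sampling per tree
)
model.fit(X, y)
print(model.predict(X[:5]))
```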