2020
DOI: 10.1111/insr.12381
A Geometrical Interpretation of Collinearity: A Natural Way to Justify Ridge Regression and Its Anomalies

Abstract: Justifying ridge regression from a geometrical perspective is one of the main contributions of this paper. To the best of our knowledge, this question has not been treated previously. This paper shows that ridge regression is a particular case of the raising procedures, which provide greater flexibility by transforming the matrix X associated with the model. Thus, raising procedures, based on a geometrical idea of the vector space spanned by the columns of the matrix X, lead naturally to ridge regression…
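The abstract's claim that ridge regression arises from a transformation of the matrix X can be illustrated with the standard augmented-data identity: the ridge estimator with penalty k equals ordinary least squares applied to X stacked with √k·I (and y padded with zeros). A minimal NumPy sketch (the data and variable names are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, k = 50, 3, 0.7  # k is the ridge penalty
X = rng.normal(size=(n, p))
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(size=n)

# Ridge estimator in closed form: (X'X + kI)^{-1} X'y
beta_ridge = np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)

# The same estimator as plain OLS on a transformed design:
# stack sqrt(k)*I under X and zeros under y.
X_aug = np.vstack([X, np.sqrt(k) * np.eye(p)])
y_aug = np.concatenate([y, np.zeros(p)])
beta_ols_aug, *_ = np.linalg.lstsq(X_aug, y_aug, rcond=None)

assert np.allclose(beta_ridge, beta_ols_aug)
```

The identity holds exactly because the augmented normal equations X_aug'X_aug β = X_aug'y_aug reduce to (X'X + kI)β = X'y, which is the geometric "transform X, then run OLS" viewpoint the paper generalises.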

Cited by 4 publications (7 citation statements) · References 30 publications
“…Thus, it will be concluded that a single application of the raise regression does not guarantee the mitigation of the multicollinearity. Consequently, this extension complements the results presented by García et al [31] and determines, on the one hand, whether it is necessary to apply a successive raise regression (see García et al [31] for more details) and, on the other hand, the most adequate variable to raise and the optimal value of the raising factor in order to guarantee the mitigation of the multicollinearity.…”
supporting
confidence: 68%
“…Thus, the selection of an adequate value for λ is essential, analogously to what occurs with the ridge factor K. A preliminary proposal on how to select the raising factor in a model with two independent standardized variables can be found in García et al [41]. Other recently published papers introduce and highlight the various advantages of raise estimators for statistical analysis: Salmerón et al [30] presented the raise regression for p = 3 standardized variables and showed that it presents better properties than the ridge regression and that the individual inference of the raised variable is not altered, García et al [31] showed that it is guaranteed that all the VIFs associated with the model in Equation (5) diminish but that it is not possible to quantify the decrease, García and Ramírez [42] presented the successive raise regression, and García et al [31] showed, among other questions, that ridge regression is a particular case of raise regression.…”
Section: Raise Regression
mentioning
confidence: 99%
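The quoted passages discuss raising a variable to reduce variance inflation factors (VIFs). As a sketch, assuming the raise transformation described in the literature these citations draw on (replace a column x_j by x_j + λ·e_j, where e_j is the residual of regressing x_j on the remaining columns, with λ the raising factor), the effect on VIFs can be checked numerically; the data here is synthetic and illustrative:

```python
import numpy as np

def vif(X):
    """VIF of each column of X, computed on centred columns."""
    Xc = X - X.mean(axis=0)
    out = []
    for j in range(Xc.shape[1]):
        others = np.delete(Xc, j, axis=1)
        # R^2 of regressing column j on the remaining columns
        coef, *_ = np.linalg.lstsq(others, Xc[:, j], rcond=None)
        resid = Xc[:, j] - others @ coef
        r2 = 1 - (resid @ resid) / (Xc[:, j] @ Xc[:, j])
        out.append(1 / (1 - r2))
    return np.array(out)

rng = np.random.default_rng(1)
n = 100
z = rng.normal(size=n)
# Two nearly collinear regressors plus an independent one
X = np.column_stack([z + 0.05 * rng.normal(size=n),
                     z + 0.05 * rng.normal(size=n),
                     rng.normal(size=n)])

# Raise column 0: x0 -> x0 + lam * e0, with e0 the residual
# of regressing x0 on the other columns
lam = 5.0
others = np.delete(X, 0, axis=1)
coef, *_ = np.linalg.lstsq(others, X[:, 0], rcond=None)
e0 = X[:, 0] - others @ coef
X_raised = X.copy()
X_raised[:, 0] = X[:, 0] + lam * e0

print(vif(X))         # large VIFs for the collinear pair
print(vif(X_raised))  # raised column's VIF shrinks
```

Raising stretches the component of x0 that is orthogonal to the other regressors, so its R² against them falls and its VIF drops, which is the mitigation effect the quoted statements evaluate; how to pick λ (and which column to raise) is exactly the open question they address.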