2015
DOI: 10.1080/03610918.2015.1096378
Ridge Regression and Generalized Maximum Entropy: An improved version of the Ridge–GME parameter estimator

Abstract: In this paper, the Ridge-GME parameter estimator, which combines Ridge Regression and Generalized Maximum Entropy, is improved in order to eliminate the subjectivity in the analysis of the ridge trace. A serious concern with the visual inspection of the ridge trace to define the supports for the parameters in the Ridge-GME parameter estimator is the misinterpretation of some ridge traces, in particular where some of them are very close to the axes. A simulation study and two empirical applications are used to …
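The abstract above turns on the ridge trace: the path of ridge-regression coefficient estimates as the penalty grows, which the Ridge-GME estimator inspects to set support intervals for the GME parameters. A minimal numpy sketch of computing such a trace on a deliberately collinear design (all names here are illustrative; this is not the authors' code):

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 3
# Collinear design: the third column is nearly a copy of the first.
X = rng.standard_normal((n, p))
X[:, 2] = X[:, 0] + 0.01 * rng.standard_normal(n)
beta_true = np.array([1.0, 2.0, 0.0])
y = X @ beta_true + 0.1 * rng.standard_normal(n)

# Ridge estimates beta_hat(lam) = (X'X + lam*I)^{-1} X'y over a grid.
lambdas = np.logspace(-4, 2, 30)
trace = np.array([
    np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)
    for lam in lambdas
])
# Each row of `trace` is the ridge estimate at one lambda; plotting the
# rows against lambdas gives the ridge trace whose visual inspection
# the Ridge-GME approach seeks to replace with an objective rule.
print(trace.shape)  # (30, 3)
```

Coefficients shrink toward zero as the penalty grows, and the unstable, collinear components stabilize first, which is what the trace is read for.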

Cited by 11 publications (8 citation statements) | References 35 publications
“…4. Although the coefficient of variable OI is not significantly different from zero in any case, the not expected negative sign obtained in the model in Equation (15) is corrected in the models in Equations (17) and (18). 5.…”
Section: Raised Lim
Mentioning confidence: 58%
“…In the words of Tutz and Ulbricht [13] "the elastic net catches all the big fish", meaning that it selects the whole group. From a different point of view, other authors have also presented different techniques and methods well suited for dealing with the collinearity problems: continuum regression ([14]), least angle regression ([15]), generalized maximum entropy ([16–18]), the principal component analysis (PCA) regression ([19,20]), the principal correlation components estimator ([21]), penalized splines ([22]), partial least squares (PLS) regression ([23,24]), or the surrogate estimator focused on the solution of the normal equations presented by Jensen and Ramirez [25].…”
Section: Introduction
Mentioning confidence: 99%
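Among the collinearity remedies listed in the excerpt above, principal component analysis (PCA) regression is straightforward to sketch: regress the response on the leading principal-component scores, then map the fitted coefficients back to the original variables. A hedged numpy illustration (names and data are invented for the example, not taken from any cited paper):

```python
import numpy as np

rng = np.random.default_rng(1)
n, p, k = 60, 4, 2          # keep the top k principal components
X = rng.standard_normal((n, p))
# Induce collinearity: the last column is nearly x1 - x2.
X[:, 3] = X[:, 0] - X[:, 1] + 0.01 * rng.standard_normal(n)
y = X @ np.array([1.0, -1.0, 0.5, 0.0]) + 0.1 * rng.standard_normal(n)

Xc = X - X.mean(axis=0)
# SVD of the centered design gives the principal directions (rows of Vt).
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:k].T                               # component scores
gamma = np.linalg.lstsq(Z, y - y.mean(), rcond=None)[0]
beta_pcr = Vt[:k].T @ gamma                     # back to original coordinates
print(beta_pcr.shape)  # (4,)
```

Discarding the trailing, near-null components removes the directions along which the least-squares solution is ill-conditioned, at the cost of some bias, which is the same bias-variance trade-off that motivates ridge regression and GME.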
“…They have found that when the sample size increases, the Prediction Sum of Squares (PRESS) value decreases as the correlation coefficient becomes large. Macedo [33] has improved the ridge-GME parameter estimator, which combines ridge regression and generalized maximum entropy to eliminate the subjectivity in the analysis of the ridge trace. Lukman et al [34] classify the estimators based on Dorugade [14] into different forms.…”
Section: Literature Review
Mentioning confidence: 99%
“…Usually, the last option is the only one applicable, and so alternative estimation methods must be applied. Macedo () proposed a list of methods with which to estimate when the degree of multicollinearity is severe, including ridge regression, principal component regression, partial least squares regression, continuum regression, lasso, elastic net, least angle regression, and generalised maximum entropy. However, in this paper, we apply orthogonal regression, a methodology that was introduced by Novales, Salmerón, García, García & López () for the regression model with three independent variables and which was later expanded by Salmerón, García, García, and García () for four variables.…”
Section: Logistic Regression
Mentioning confidence: 99%