In multiple regression analysis, the most frequently occurring problems are multicollinearity and outliers, both of which have undesirable effects on the least squares estimates of the regression parameters. The Jackknifed Ridge Regression estimator and the M-estimator have been proposed to overcome multicollinearity and outliers, respectively. The Jackknifed Ridge Regression estimator is obtained by shrinking the Ordinary Least Squares estimator; since the Ordinary Least Squares estimator is sensitive to outliers, the Jackknifed Ridge Regression estimator inherits this sensitivity. To handle multicollinearity and outliers simultaneously, we propose a new estimator, the Jackknifed Ridge M-estimator, which is obtained by shrinking an M-estimator instead of the Ordinary Least Squares estimator.
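As a rough illustration of this construction (a sketch under assumptions, not the authors' code), the following applies the usual jackknifed ridge shrinkage matrix, I - k^2 (X'X + kI)^{-2}, to a Huber M-estimate rather than to the Ordinary Least Squares estimate; the biasing constant k, the Huber loss, and the simulated data are assumptions made only for the example.

    # Sketch: jackknifed-ridge-type shrinkage applied to an M-estimate instead of OLS.
    # Assumptions: fixed biasing constant k, Huber M-estimation via statsmodels,
    # and a small simulated design with near-collinear columns and a few outliers.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n, p = 100, 3
    X = rng.normal(size=(n, p))
    X[:, 2] = X[:, 1] + 0.01 * rng.normal(size=n)   # induce multicollinearity
    beta_true = np.array([1.0, 2.0, -1.0])
    y = X @ beta_true + rng.normal(size=n)
    y[:5] += 15.0                                    # contaminate a few responses

    k = 0.5                                          # ridge biasing constant (assumed)
    XtX = X.T @ X
    A_inv = np.linalg.inv(XtX + k * np.eye(p))
    shrink = np.eye(p) - k**2 * (A_inv @ A_inv)      # jackknifed ridge shrinkage matrix

    beta_ols = np.linalg.solve(XtX, X.T @ y)         # Ordinary Least Squares
    beta_m = sm.RLM(y, X, M=sm.robust.norms.HuberT()).fit().params  # Huber M-estimate

    beta_jr = shrink @ beta_ols                      # Jackknifed Ridge (shrinks OLS)
    beta_jrm = shrink @ beta_m                       # Jackknifed Ridge M-estimator (shrinks the M-estimate)

    print("OLS :", beta_ols)
    print("JR  :", beta_jr)
    print("JRM :", beta_jrm)

The only change relative to the Jackknifed Ridge Regression estimator is the vector being shrunk: the outlier-resistant M-estimate replaces the outlier-sensitive Ordinary Least Squares estimate.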
In multiple linear regression, multicollinearity and outliers are commonly occurring problems that have undesirable effects on the ordinary least squares estimator. Many alternative parameter estimation methods in the literature deal with these problems independently, yet in practice multicollinearity and outliers may occur simultaneously. In this article, we present a new estimator, the Linearized Ridge M-estimator, which combats the simultaneous occurrence of multicollinearity and outliers. A real data example and a simulation study are carried out to illustrate the performance of the proposed estimator.
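The abstract does not give the explicit form of the Linearized Ridge M-estimator, so the sketch below only mimics the kind of simulation study it mentions: simulated mean squared error of several existing estimators under simultaneous multicollinearity and outliers, with an ordinary ridge-type shrinkage of a Huber M-estimate included purely as a stand-in for a combined estimator. The design parameters (n, p, k, contamination rate) are assumptions.

    # Sketch of a simulation study: simulated MSE of several estimators when
    # multicollinearity and outliers occur together. The "Ridge-M" line is a
    # stand-in (ridge-type shrinkage of a Huber M-estimate), not the proposed
    # Linearized Ridge M-estimator, whose form is not given in the abstract.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    n, p, k, reps = 50, 3, 1.0, 500
    beta = np.array([1.0, 1.0, 1.0])
    mse = {"OLS": 0.0, "Ridge": 0.0, "Huber M": 0.0, "Ridge-M (stand-in)": 0.0}

    for _ in range(reps):
        X = rng.normal(size=(n, p))
        X[:, 2] = X[:, 0] + 0.05 * rng.normal(size=n)    # near-collinear columns
        e = rng.normal(size=n)
        out = rng.random(n) < 0.10                       # ~10% contaminated errors
        e[out] = rng.normal(0.0, 10.0, size=out.sum())
        y = X @ beta + e

        XtX = X.T @ X
        A_inv = np.linalg.inv(XtX + k * np.eye(p))
        b_ols = np.linalg.solve(XtX, X.T @ y)
        b_ridge = A_inv @ X.T @ y
        b_m = sm.RLM(y, X, M=sm.robust.norms.HuberT()).fit().params
        b_ridge_m = A_inv @ XtX @ b_m                    # ridge-type shrinkage of the M-estimate

        for name, b in [("OLS", b_ols), ("Ridge", b_ridge),
                        ("Huber M", b_m), ("Ridge-M (stand-in)", b_ridge_m)]:
            mse[name] += np.sum((b - beta) ** 2) / reps

    for name, v in mse.items():
        print(f"{name:20s} simulated MSE = {v:.4f}")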