A well-known concern with the usual linear regression model is multicollinearity. As the strength of the association among the independent variables increases, the squared standard errors of the regression estimators tend to increase, which in turn can substantially reduce power. This paper examines heteroscedastic methods for dealing with this issue when testing the hypothesis that all of the slope parameters are equal to zero, using a robust ridge estimator that guards against outliers in the dependent variable. Results are also included on leverage points, that is, outliers among the independent variables. In various situations, the proposed method increases power substantially.
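For reference, the sketch below shows only the classical ridge estimator, (X'X + kI)^{-1}X'y, which shrinks the slope estimates when the independent variables are highly correlated. It is a minimal illustration under assumed names (ridge_estimator, NumPy); the paper's robust estimator and its heteroscedastic test that all slopes are zero are not reproduced here.

```python
import numpy as np

def ridge_estimator(X, y, k=0.1):
    """Classical ridge estimate (X'X + kI)^{-1} X'y.

    Illustrative only: the paper's method pairs a ridge-type penalty with a
    robust estimator (resistant to outliers in y and to leverage points in X)
    and a heteroscedastic test of H0: all slopes equal zero, none of which
    is shown here.
    """
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    # Center and scale the predictors, as is standard before applying ridge.
    Xc = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
    yc = y - y.mean()
    p = Xc.shape[1]
    # Ridge solution: solve (Xc'Xc + kI) b = Xc'yc for the slope vector b.
    return np.linalg.solve(Xc.T @ Xc + k * np.eye(p), Xc.T @ yc)
```

For example, ridge_estimator(X, y, k=0.05) returns shrunken (biased but lower-variance) slope estimates, which is the usual motivation for ridge methods under multicollinearity.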