2020
DOI: 10.9734/ajpas/2020/v6i130151

Multicollinearity Effect in Regression Analysis: A Feed Forward Artificial Neural Network Approach

Abstract: In this study we compared the performance of Ordinary Least Squares Regression (OLSR) and an Artificial Neural Network (ANN) in the presence of multicollinearity, using two datasets – a real-life insurance dataset and a simulated dataset – to determine which of the two methods models a highly correlated dataset better, with the Root Mean Square Error (RMSE) as the performance measure. The ANN performed better than the OLSR model for all the different ANN models except the models with nine and ten nodes in the hidden layer f…
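The abstract's performance measure, RMSE, can be sketched as follows. This is a minimal illustration only: the prediction values below are hypothetical and are not taken from the paper's insurance or simulated data.

```python
import math

def rmse(actual, predicted):
    """Root Mean Square Error, the performance measure used in the study."""
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual))

# Hypothetical test-set targets and predictions from the two models.
y_true = [3.0, 5.0, 7.0, 9.0]
y_olsr = [2.8, 5.3, 6.9, 9.4]   # illustrative OLSR predictions
y_ann  = [3.1, 4.9, 7.2, 8.9]   # illustrative ANN predictions

print(rmse(y_true, y_olsr))
print(rmse(y_true, y_ann))      # lower RMSE here, mirroring the study's finding
```

The model with the smaller RMSE on held-out data is the better fit under this criterion, which is how the study ranks OLSR against the various ANN architectures.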

Cited by 49 publications (21 citation statements)
References 3 publications
“… is the bias, which can be interpreted as the intercept in a linear regression. A final transformation σ₀ is applied to the output (Obite et al., 2020), and the quadratic error function given in equation (5) was used in this study to determine the weights.…”
Section: Methods (mentioning)
confidence: 99%
“…In order to achieve this, Akpanta and Iwueze's [23] method of using Bartlett's transformation [24] was used. The regression line obtained was ŷ = 1.07602x − 3.99102 (see Appendix II for the full result of the regression analysis).…”
Section: Results from SARIMA (mentioning)
confidence: 99%
“…To ensure that the seasonal artificial neural network would converge quickly [22], the input and target variables were scaled using the Min-Max normalization approach, thus:…”
Section: (D) Forecasting (mentioning)
confidence: 99%
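The Min-Max normalization mentioned in this citation statement maps each variable linearly onto a target interval, typically [0, 1], via x′ = (x − min)/(max − min). A minimal sketch (the function name and sample values are illustrative, not from the paper):

```python
def min_max_scale(values, new_min=0.0, new_max=1.0):
    """Min-Max normalization: map values linearly onto [new_min, new_max]."""
    lo, hi = min(values), max(values)
    return [new_min + (v - lo) * (new_max - new_min) / (hi - lo) for v in values]

print(min_max_scale([10, 20, 30, 40]))
# [0.0, 0.3333333333333333, 0.6666666666666666, 1.0]
```

Scaling inputs and targets to a common range like this is a standard way to speed up neural-network convergence, since it keeps all features on comparable magnitudes.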
“…Test for multi-collinearity. Multi-collinearity refers to the problem that arises when the independent variables in a multiple regression model are highly correlated with one another (Obite et al., 2020; Wonsuk et al., 2014; Zhang et al., 2020). The presence of multi-collinearity can lead to wider confidence intervals and less reliable probability values (P-values) for the parameter estimates of the independent variables, drastically reducing the statistical significance and predictive power of the regression model (Paul, 2006; Zhou and Huang, 2018).…”
Section: Logistic Regression Model (mentioning)
confidence: 99%
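The multicollinearity problem discussed above is commonly quantified with the Variance Inflation Factor (VIF); in the two-predictor case it reduces to VIF = 1/(1 − r²), where r is the Pearson correlation between the predictors. A minimal sketch of that special case (the function names and data are illustrative; the cited works do not prescribe this exact code):

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def vif_two_predictors(x1, x2):
    """Variance Inflation Factor for the two-predictor case: 1 / (1 - r^2)."""
    r = pearson_r(x1, x2)
    return 1.0 / (1.0 - r ** 2)

x1 = [1, 2, 3, 4, 5]
x2 = [2.1, 3.9, 6.2, 8.0, 9.9]   # nearly collinear with x1
print(vif_two_predictors(x1, x2))  # far above the common rule-of-thumb cutoff of 10
```

A VIF well above 10 signals that a predictor's variance is badly inflated by collinearity, which is exactly the condition under which the wider confidence intervals and unreliable P-values described above arise.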