2020 International Conference on Computer Science and Software Engineering (CSASE)
DOI: 10.1109/csase48920.2020.9142066
Using Standard Error to Find the Best Robust Regression in Presence of Multicollinearity and Outliers

Cited by 5 publications (3 citation statements); references 15 publications.
“…To detect multicollinearity, the pairwise correlation coefficients among the independent variables are examined; a correlation greater than 0.8 indicates severe multicollinearity [79,80]. No pair of variables exceeded 0.80, indicating no multicollinearity problems.…”
Section: Results (mentioning)
confidence: 99%
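As a hedged illustration of the pairwise-correlation check described in the statement above, the sketch below assumes pandas and NumPy, a hypothetical helper named flag_collinear_pairs, and synthetic data; it reports predictor pairs whose absolute Pearson correlation exceeds the cited 0.8 threshold.

import numpy as np
import pandas as pd

def flag_collinear_pairs(X, threshold=0.8):
    """Return (var_a, var_b, |r|) for pairs whose absolute Pearson correlation exceeds `threshold`."""
    corr = X.corr(method="pearson").abs()
    # Keep only the strict upper triangle so each pair is reported once.
    mask = np.triu(np.ones(corr.shape, dtype=bool), k=1)
    upper = corr.where(mask)
    return [(a, b, float(upper.loc[a, b]))
            for a in upper.index for b in upper.columns
            if pd.notna(upper.loc[a, b]) and upper.loc[a, b] > threshold]

# Synthetic example (illustrative only): x2 is nearly a copy of x1, so the pair (x1, x2) is flagged.
rng = np.random.default_rng(0)
x1 = rng.normal(size=200)
X = pd.DataFrame({"x1": x1,
                  "x2": x1 + rng.normal(scale=0.1, size=200),
                  "x3": rng.normal(size=200)})
print(flag_collinear_pairs(X))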
“…The multicollinearity between the independent variables was evaluated by Spearman's rank correlation and the correlation coefficients were not too great (< 0.7) [24].…”
Section: Discussion (mentioning)
confidence: 99%
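A minimal sketch of the Spearman rank-correlation check described in the statement above, assuming NumPy and SciPy with synthetic, illustrative data; the 0.7 cutoff comes from the quoted text.

import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
x1 = rng.normal(size=150)
x2 = 0.5 * x1 + rng.normal(size=150)   # a moderately related second predictor
rho, p_value = spearmanr(x1, x2)
print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f})")
# The quoted criterion treats |rho| < 0.7 as acceptable; flag the pair otherwise.
print("potential multicollinearity" if abs(rho) >= 0.7 else "acceptable (|rho| < 0.7)")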
“…Such a combination is also considered by other researchers, but they focus on other regression models. For the LRM, Lukman et al. [21], Pati et al. [22], Ibrahim and Yahya [23], Majid et al. [24,25], Arum et al. [26] and Lukman et al. [27] considered this combination. For the GLM, Arum et al. [28] considered this combination of problems and presented the robust modified jackknife ridge estimator for the Poisson regression.…”
Section: Introduction (mentioning)
confidence: 99%