2010
DOI: 10.1007/978-3-642-15387-7_15
Nonparametric Statistical Analysis of Machine Learning Algorithms for Regression Problems

Cited by 24 publications (24 citation statements). References: 20 publications.
“…In our case study we applied 6 regression neural algorithms to 29 benchmark data sets and performed a statistical analysis of the results obtained using nonparametric tests and post-hoc procedures designed especially for multiple comparisons. Our preliminary work was presented at the KES2010 conference (Graczyk et al, 2010).…”
Section: B. Trawiński et al.; classification: mentioning; confidence: 99%
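The nonparametric, multiple-comparison workflow quoted above can be reproduced with standard tools. The following is a minimal sketch, assuming a matrix of error scores with one row per benchmark data set and one column per algorithm; the variable names, the synthetic scores, and the choice of the Friedman test with average ranks are illustrative, not taken from the cited paper.

```python
# Minimal sketch: nonparametric comparison of several regressors over many
# data sets using the Friedman test plus average ranks (values are synthetic).
import numpy as np
from scipy.stats import friedmanchisquare, rankdata

# scores[i, j] = error of algorithm j on data set i (hypothetical values)
rng = np.random.default_rng(0)
scores = rng.random((29, 6))  # 29 benchmark data sets, 6 algorithms

# Friedman test: do the algorithms differ significantly across data sets?
stat, p_value = friedmanchisquare(*scores.T)
print(f"Friedman chi-square = {stat:.3f}, p = {p_value:.4f}")

# Average rank of each algorithm (lower error -> better rank); post-hoc
# procedures for multiple comparisons (e.g. Nemenyi, Holm) start from these.
ranks = np.vstack([rankdata(row) for row in scores])
print("average ranks:", ranks.mean(axis=0).round(2))
```

If the Friedman test rejects the hypothesis that all algorithms perform equally, a post-hoc procedure is then applied to the average ranks to decide which pairwise differences are significant.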
“…independence, normality, and heteroscedasticity. This is not the case in the majority of experiments in machine learning [37]. Thus, we investigated the statistical significance of the differences on performance using the nonparametric Wilcoxon test; we kept the result of the AUC measure for each fold and each classifier, and then compared them using Wilcoxon [17].…”
Section: Methods; classification: mentioning; confidence: 99%
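The per-fold Wilcoxon comparison described in this excerpt can be sketched as follows; the two AUC arrays are hypothetical placeholders for the scores collected from each cross-validation fold for two classifiers.

```python
# Paired Wilcoxon signed-rank test on per-fold AUC values of two classifiers
# (the AUC arrays below are hypothetical placeholders, one value per fold).
import numpy as np
from scipy.stats import wilcoxon

auc_model_a = np.array([0.81, 0.79, 0.84, 0.80, 0.83, 0.82, 0.78, 0.85, 0.80, 0.81])
auc_model_b = np.array([0.77, 0.78, 0.80, 0.79, 0.80, 0.79, 0.76, 0.82, 0.78, 0.79])

stat, p_value = wilcoxon(auc_model_a, auc_model_b)
print(f"Wilcoxon statistic = {stat:.1f}, p = {p_value:.4f}")
```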
“…Hence, the modified kernel regression for function approximation is proposed to enhance and improve the original Nadaraya-Watson kernel regression. Whereas the existing techniques to solve small samples rely on the ANNBP, a non-deterministic prediction model [19] that tends to produce inconsistent predictions, the proposed model produces consistent predictions due to the convexity of the weight estimation during training. The proposed model also does not require an artificial sample generation method to be incorporated in the model development.…”
Section: Introduction; classification: mentioning; confidence: 96%
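For reference, the original Nadaraya-Watson estimator that this excerpt builds on can be written in a few lines; the Gaussian kernel, the bandwidth value, and the synthetic data below are illustrative choices, not the modification proposed by the citing authors.

```python
# Minimal Nadaraya-Watson kernel regression with a Gaussian kernel.
# The prediction at a query point is a weighted average of training targets,
# with weights K((x - x_i) / h) normalized to sum to one (h is illustrative).
import numpy as np

def nadaraya_watson(x_query, x_train, y_train, bandwidth=0.3):
    u = (x_query[:, None] - x_train[None, :]) / bandwidth
    weights = np.exp(-0.5 * u ** 2)                # Gaussian kernel values
    weights /= weights.sum(axis=1, keepdims=True)  # normalize per query point
    return weights @ y_train

# Tiny usage example on synthetic one-dimensional data
x_train = np.linspace(0, 1, 20)
y_train = np.sin(2 * np.pi * x_train) + 0.1 * np.random.default_rng(1).normal(size=20)
x_query = np.linspace(0, 1, 5)
print(nadaraya_watson(x_query, x_train, y_train).round(3))
```

Because each prediction is a convex combination of the training targets, the estimator behaves deterministically for a fixed bandwidth, which is the property the excerpt contrasts with the non-deterministic ANNBP model.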