2013
DOI: 10.9780/23218045/1172013/41

Comparing Ensembles Of Decision Trees And Neural Networks For One-day-ahead Stream Flow Prediction

Abstract: Ensemble learning methods have received remarkable attention in recent years and have led to considerable advances in the performance of regression and classification models. Bagging and boosting are among the most popular ensemble learning techniques proposed to reduce the prediction error of learning machines. In this study, bagging and gradient boosting algorithms are incorporated into the model creation process for daily streamflow prediction. This paper compares two tree-based ensembles (bagged re…
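The setup described in the abstract can be illustrated with a minimal, self-contained sketch. This is not the paper's model or data: the "streamflow" series is synthetic, the base learner is a hand-rolled depth-1 regression tree (stump) on a single lagged-flow feature, and bagging is done by averaging stumps fit on bootstrap resamples.

```python
import random

random.seed(0)

# Hypothetical synthetic "streamflow" series: a noisy AR(1) process,
# standing in for daily discharge observations.
flow = [50.0]
for _ in range(300):
    flow.append(0.9 * flow[-1] + 5.0 + random.gauss(0, 2.0))

# One-day-ahead pairs: predict flow[t+1] from flow[t] (single lag feature).
pairs = [(flow[t], flow[t + 1]) for t in range(len(flow) - 1)]
train, test = pairs[:250], pairs[250:]

def fit_stump(data):
    """Depth-1 regression tree: one threshold on the lagged flow,
    leaf values = mean target on each side of the split."""
    best = None
    for thr in sorted({x for x, _ in data}):
        left = [y for x, y in data if x <= thr]
        right = [y for x, y in data if x > thr]
        if not left or not right:
            continue
        ml, mr = sum(left) / len(left), sum(right) / len(right)
        sse = sum((y - (ml if x <= thr else mr)) ** 2 for x, y in data)
        if best is None or sse < best[0]:
            best = (sse, thr, ml, mr)
    _, thr, ml, mr = best
    return lambda x: ml if x <= thr else mr

# Bagging: fit each stump on a bootstrap resample, average the predictions.
B = 25
stumps = [fit_stump([random.choice(train) for _ in train]) for _ in range(B)]

def bagged(x):
    return sum(s(x) for s in stumps) / B

def rmse(model):
    return (sum((model(x) - y) ** 2 for x, y in test) / len(test)) ** 0.5

single = fit_stump(train)
print(f"single stump RMSE:    {rmse(single):.2f}")
print(f"bagged ensemble RMSE: {rmse(bagged):.2f}")
```

In practice the lag structure, tree depth, and ensemble size would all be tuned; the point here is only the mechanics of bootstrap aggregation for one-step-ahead prediction.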

Cited by 7 publications (4 citation statements); references 32 publications (24 reference statements).
“…More precisely, the employed ANNs achieved the best results in solving the considered RPO problem, with up to a 6% improvement in average accuracy over the Naive Bayes classifier, up to 12% over the Multi-Layer Perceptron classifier, and up to 13% over a state-of-the-art ANN for power forecasting. Finally, their evaluation metrics (such as Precision, Sensitivity, and F-Score) are slightly higher than those obtained from the J48 decision trees, as already observed in several studies comparing neural networks with decision trees (Tharaha and Rashika 2017; Karakurt et al 2013; Ahmad et al 2017).…”
Section: Achieved Results and Discussion (supporting)
confidence: 54%
“…However, comparative studies on the performance of various ensemble learning models have found that bagging outperforms boosting and stacking [16], [17]. Bagging can significantly decrease an ML model's prediction error and variance when combined with an appropriate base-learner generation strategy [17], [18].…”
Section: Background and Related Work (mentioning)
confidence: 99%
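The variance-reduction claim in the statement above can be illustrated with a toy simulation. Everything here is an assumption for illustration, not taken from the cited studies: a hard-thresholding rule stands in for an unstable base learner, and we compare the spread of its predictions across independent training sets with and without bootstrap aggregation.

```python
import random
import statistics

random.seed(1)

def noisy_fit(sample):
    # Stand-in for an unstable base learner: a hard threshold on the
    # sample mean, which flips between 0 and 1 for borderline samples.
    return 1.0 if sum(sample) / len(sample) > 10.0 else 0.0

def experiment(bagged, trials=400, n=30, B=20):
    """Variance of the prediction across independent training sets,
    with and without bootstrap aggregation of the base learner."""
    preds = []
    for _ in range(trials):
        data = [random.gauss(10.0, 3.0) for _ in range(n)]
        if bagged:
            # Average B fits, each on a bootstrap resample of the data.
            fits = [noisy_fit([random.choice(data) for _ in range(n)])
                    for _ in range(B)]
            preds.append(sum(fits) / B)
        else:
            preds.append(noisy_fit(data))
    return statistics.variance(preds)

v_single = experiment(bagged=False)
v_bagged = experiment(bagged=True)
print(f"variance, single learner: {v_single:.3f}")
print(f"variance, bagged learner: {v_bagged:.3f}")
```

The effect is largest for exactly this kind of discontinuous, high-variance learner; for a smooth, stable estimator (e.g. a plain sample mean) bagging would change the variance very little, which is consistent with the statement that the benefit depends on how the base learners are generated.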
“…Decision trees and neural network methods demonstrate good prediction and classification quality compared with machine learning methods such as Random Forest, Support Vector Machine, Naive Bayes, and K-Nearest Neighbors. The paper [11] compared ensembles of decision trees and neural networks for one-day-ahead streamflow prediction. The results obtained in that study indicate that ensemble learning models yield better prediction accuracy than a conventional artificial neural network model such as a multilayer perceptron.…”
Section: Related Work (mentioning)
confidence: 99%