A survey on ensemble learning

2019. DOI: 10.1007/s11704-019-8208-z

Cited by 1,210 publications (477 citation statements)
References 144 publications
“…These results show that using HAF with ensemble learning is highly promising for recognizing emotions in human speech. The study findings were consistent with the literature that ensemble learning gives better predictive performance by fusing the predictions of multiple inducers [75,85]. The ability of ensemble learning algorithms to mimic human nature by seeking opinions from several inducers before making an informed decision distinguishes them from single inducers [73].…”
Section: Results (supporting, confidence: 89%)
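The fusion of predictions from multiple inducers described above can be sketched as a simple majority vote. This is a minimal illustration, not the cited study's actual fusion scheme; the function name and sample data are hypothetical.

```python
import numpy as np

def majority_vote(predictions):
    """Fuse class predictions of several inducers by majority vote.

    predictions: array-like of shape (n_inducers, n_samples) of
    non-negative integer class labels.
    """
    predictions = np.asarray(predictions)
    # For each sample (column), pick the label most inducers agreed on.
    fused = [np.bincount(col).argmax() for col in predictions.T]
    return np.array(fused)

# Three hypothetical inducers classify five samples.
preds = [
    [0, 1, 1, 0, 2],
    [0, 1, 0, 0, 2],
    [1, 1, 1, 0, 1],
]
print(majority_vote(preds))  # [0 1 1 0 2]
```

With an odd number of inducers, ties cannot occur in binary problems; for multi-class ties, `argmax` simply returns the lowest tied label.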
“…The other bagging ensemble learning algorithms investigated in this study were bagging with SVM as the base learner (BSVM) and bagging with a multilayer perceptron neural network as the base learner (BMLP). The boosting ensemble algorithms investigated were the gradient boosting machine (GBM), which extends boosting by combining gradient descent optimization with the boosting technique [75,82,84], and AdaBoost with CART as the base learner (ABC), one of the most widely used boosting algorithms for reducing sensitivity to class label noise [79,85]. AdaBoost is an iterative learning algorithm that constructs a strong classifier by enhancing weak classifiers; through continuous learning it can improve classification ability by reducing both bias and variance [81].…”
Section: Classification (mentioning, confidence: 99%)
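The two boosting variants named above (GBM, and AdaBoost with a CART base learner) can be sketched with Scikit-Learn. This is a generic illustration on synthetic data, not the cited study's experimental setup; scikit-learn's `AdaBoostClassifier` uses a depth-1 CART decision tree as its default base learner, matching the ABC configuration described.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, GradientBoostingClassifier
from sklearn.model_selection import train_test_split

# Synthetic binary classification data (placeholder for real features).
X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# AdaBoost: iteratively reweights samples and combines weak CART stumps
# (the default base learner is a depth-1 decision tree).
abc = AdaBoostClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)

# Gradient boosting machine: each new tree is fitted to the gradient of
# the loss, i.e. gradient descent combined with the boosting technique.
gbm = GradientBoostingClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)

print(abc.score(X_te, y_te), gbm.score(X_te, y_te))
```

Both models follow the same sequential principle: each new weak learner focuses on the errors left by the ensemble built so far.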
“…Random Forest is an ensemble learning method that integrates decision trees through bagging [29,68]. In this paper, a Random Forest model is constructed with the machine learning platform Scikit-Learn (http://scikit-learn.org, 2018).…”
Section: Random Forest Classification (mentioning, confidence: 99%)
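A minimal sketch of the approach this excerpt describes, using Scikit-Learn's `RandomForestClassifier` on the built-in iris dataset (the dataset and parameters are illustrative, not those of the cited paper):

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = load_iris(return_X_y=True)

# Random Forest = bagging of decision trees: each tree is trained on a
# bootstrap sample and considers a random subset of features per split.
rf = RandomForestClassifier(n_estimators=100, random_state=0)

scores = cross_val_score(rf, X, y, cv=5)
print(scores.mean())
```

Averaging over many decorrelated trees is what reduces the variance of a single decision tree.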
“…These algorithms have appealed to many researchers in different fields of research, such as traffic prediction [19], engine health prediction [20], and battery state-of-charge estimation [21], owing to their effectiveness in prediction. The output of an ensemble algorithm is obtained by merging the base learners' outputs [22]. Any kind of ML technique, such as an ANN or a decision tree, can serve as the base learner [23].…”
Section: Introduction (mentioning, confidence: 99%)
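The point that any ML technique can serve as a base learner, with the ensemble output obtained by merging the base learners' outputs, can be sketched with a heterogeneous voting ensemble. This is a generic illustration on synthetic data (an ANN and a decision tree merged by soft voting), not any cited system:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=400, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Heterogeneous base learners: an ANN and a decision tree. Their class
# probabilities are averaged ("soft" voting) to produce the ensemble output.
ensemble = VotingClassifier(
    estimators=[
        ("ann", MLPClassifier(max_iter=1000, random_state=0)),
        ("tree", DecisionTreeClassifier(random_state=0)),
    ],
    voting="soft",
).fit(X_tr, y_tr)

print(ensemble.score(X_te, y_te))
```

Soft voting requires every base learner to expose `predict_proba`; with `voting="hard"`, only the predicted labels are merged by majority vote.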