2022
DOI: 10.1109/access.2022.3207287
A Survey of Ensemble Learning: Concepts, Algorithms, Applications, and Prospects

Abstract: Ensemble learning techniques have achieved state-of-the-art performance in diverse machine learning applications by combining the predictions from two or more base models. This paper presents a concise overview of ensemble learning, covering the three main ensemble methods: bagging, boosting, and stacking, their early development to the recent state-of-the-art algorithms. The study focuses on the widely used ensemble algorithms, including random forest, adaptive boosting (AdaBoost), gradient boosting, extreme …


Cited by 358 publications (114 citation statements)
References 188 publications (193 reference statements)
“…Unlike bagging and boosting, which use combination rules such as majority voting and weighted majority voting, the stacking ensemble model uses another ML algorithm (i.e., a meta-learner) to aggregate the predictions from the level-0 models [72]. In the literature, stacking-based ensembles have been applied in diverse fields [73]–[75].…”
Section: E. Ensemble Learning
confidence: 99%
“…Having selected the training samples, training the classifier is rather straightforward. The XGB framework has been selected because of its demonstrated competence in achieving high accuracy in different classification and regression tasks [22], including very recently in wildfire research [35] [36] [37] (in quite different tasks than our work). Perhaps most importantly, the XGBoost open-source software library [21] is extremely optimized in terms of parallel execution and memory minimization, being able to train an extremely deep set of trees using more than 60,000 training patterns (as we selected for the experiments) in a few seconds.…”
Section: Training and Applying the XGB Classifier
confidence: 99%
“…The proposed methodology has been designed to be applicable in a nationwide operational framework, with the classification models being trained considering a database of 64 reference fire perimeters in Greece from 2016 to 2019. For the supervised classification step, the extreme Gradient Boosting (XGB) [21] machine learning framework has been selected, which in recent years has demonstrated very high competence in achieving high accuracy in different classification and regression tasks [22]. An empirical approach for selecting representative training patterns via difference spectral indices is also proposed, which is a crucial element and guarantees the effectiveness of the whole methodology.…”
Section: Introduction
confidence: 99%
“…Authors in [26] proposed TF-IDF (Term Frequency–Inverse Document Frequency) and BM25 [27] based information extraction methods along with a ranking method inspired by [28] over LDA and NMF topic modeling. Authors in [29] proposed an ensemble learning [30] based methodology over a truth discovery algorithm [31] that consists of a graph with topics, words, Wikipedia articles, and candidate terms as nodes, and the relationships among them as edges. Authors in [32] applied graph extraction methods over DBpedia 1 concepts.…”
Section: B. Graph-Based Approaches
confidence: 99%