2021
DOI: 10.1007/s13042-021-01442-1

Building heterogeneous ensembles by pooling homogeneous ensembles

Abstract: Heterogeneous ensembles consist of predictors of different types, which are likely to have different biases. If these biases are complementary, the combination of their decisions is beneficial and could be superior to homogeneous ensembles. In this paper, a family of heterogeneous ensembles is built by pooling classifiers from M homogeneous ensembles of different types, each of size T. Depending on the fraction of base classifiers of each type, a particular heterogeneous combination in this family is represented by …
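
The pooling scheme the abstract describes is straightforward to sketch. Below is a minimal illustration (not the authors' implementation), assuming scikit-learn >= 1.2; the choice of M = 3 base-learner types, T = 100, and the pooling fractions are illustrative assumptions:

# Minimal sketch of the pooling idea (not the authors' implementation).
# Assumes scikit-learn >= 1.2; M = 3 base-learner types, T = 100, and the
# pooling fractions below are illustrative choices.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

T = 100  # size of each homogeneous ensemble and of the pooled ensemble
base_learners = [DecisionTreeClassifier(), KNeighborsClassifier(), SVC()]  # M = 3 types
fractions = [0.5, 0.3, 0.2]  # fraction of each type in the pooled ensemble; sums to 1

X, y = make_classification(n_samples=500, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Train M homogeneous (bagging) ensembles, one per base-learner type.
homogeneous = [BaggingClassifier(estimator=bl, n_estimators=T, random_state=0).fit(X_tr, y_tr)
               for bl in base_learners]

# Pool round(f_m * T) members from ensemble m, giving a heterogeneous
# ensemble of (about) T classifiers.
pool = []
for ens, frac in zip(homogeneous, fractions):
    pool.extend(ens.estimators_[:round(frac * T)])

# Combine the pooled members by simple majority vote (binary labels 0/1).
votes = np.array([m.predict(X_te) for m in pool])
y_pred = (votes.mean(axis=0) > 0.5).astype(int)
print("pooled heterogeneous ensemble accuracy:", (y_pred == y_te).mean())

Each fraction vector picks out one member of the family the abstract describes; setting one fraction to 1 recovers the corresponding homogeneous ensemble.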

Cited by 19 publications (10 citation statements) · References 26 publications
“…Furthermore, parallel ensembles can be classified into homogeneous or heterogeneous, depending on the base learners' homogeneity. Homogeneous ensembles consist of models built using the same ML algorithm, while heterogeneous ensembles comprise models from different algorithms [46]–[48].…”
Section: Overview Of Ensemble Learning (mentioning, confidence: 99%)
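
For a concrete picture of this distinction, the following sketch (an illustration using scikit-learn, not code from the cited works) builds one ensemble of each kind:

from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

# Homogeneous: one algorithm (decision trees), repeated 100 times.
homogeneous = RandomForestClassifier(n_estimators=100)

# Heterogeneous: three different algorithms, combined by majority vote.
heterogeneous = VotingClassifier(estimators=[
    ("tree", DecisionTreeClassifier()),
    ("lr", LogisticRegression(max_iter=1000)),
    ("nb", GaussianNB()),
])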
“…For example, heterogeneous ensembles employ different ML algorithms as base learners; therefore, they are essentially diverse. The main challenge in heterogeneous ensembles is obtaining the most effective method to combine the different base learners' predictions [46]. However, the main challenge of homogeneous ensemble methods is ensuring the base learners are diverse even though they use the same ML algorithm.…”
Section: Overview Of Ensemble Learning (mentioning, confidence: 99%)
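
One simple combination rule, shown here only as an illustration (the weights and models are hypothetical, not the cited paper's method), is a weighted average of the base learners' class probabilities:

import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=300, random_state=1)
models = [DecisionTreeClassifier(random_state=1).fit(X, y),
          KNeighborsClassifier().fit(X, y),
          LogisticRegression(max_iter=1000).fit(X, y)]
weights = np.array([0.4, 0.3, 0.3])  # hypothetical; in practice tuned on held-out data

# Weighted soft vote: average the predicted class probabilities.
proba = sum(w * m.predict_proba(X) for w, m in zip(weights, models))
y_pred = proba.argmax(axis=1)
print("weighted-vote training accuracy:", (y_pred == y).mean())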
“…Zhou (2009) described the benefits of combining both fields; in particular, they find that classifier combination can mitigate the instability caused by training with unlabeled data (Vries & Theirens, 2021). Notably, combined homogeneous-heterogeneous ensembles have demonstrated better performance than the benchmark of solely one or the other (Sabzevari et al., 2022). To overcome the stochastic nature of individual classifiers, we incorporate homogeneous ensembles into our framework.…”
Section: Discussion (mentioning, confidence: 99%)
“…If these biases are complementary, the combination of biased decisions could be superior to homogeneous ensembles [16]. Ensemble learning methods most often take one of three popular forms, namely bagging [17], boosting [18], and stacking [19].…”
Section: Introduction (mentioning, confidence: 99%)
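
For reference, the three forms named in the quote each have a standard scikit-learn incarnation; this is a minimal sketch with default hyperparameters (assuming scikit-learn >= 1.2), not tied to the cited references:

from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier

# Bagging [17]: parallel training of one algorithm on bootstrap samples.
bagging = BaggingClassifier(estimator=DecisionTreeClassifier())
# Boosting [18]: sequential training, each learner reweighting prior errors.
boosting = AdaBoostClassifier()
# Stacking [19]: a meta-learner trained on the base learners' predictions.
stacking = StackingClassifier(
    estimators=[("tree", DecisionTreeClassifier()),
                ("lr", LogisticRegression(max_iter=1000))],
    final_estimator=LogisticRegression(),
)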