Vote-boosting ensembles (2018)
DOI: 10.1016/j.patcog.2018.05.022

Cited by 44 publications (20 citation statements)
References 46 publications
“…The standard classifiers considered for the heterogeneous ensemble in this study are as follows: Bayes theory (the Naïve Bayes algorithm), instance-based learning (k-nearest neighbour), rule-based learning (RIPPER) and tree methods (the C4.5 decision tree). The voting combination method [14], [15] was adopted in this study for building the heterogeneous ensemble. Voting is a straightforward way of combining the predictions of several different models, and it can be implemented in a variety of ways, including majority vote, minority vote and averaging of probabilities.…”
Section: A Dataset (mentioning)
confidence: 99%
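As a concrete illustration of the voting scheme this statement describes, the sketch below builds a heterogeneous voting ensemble with scikit-learn. GaussianNB, KNeighborsClassifier and DecisionTreeClassifier stand in for Naïve Bayes, k-NN and C4.5 (scikit-learn's trees are CART rather than C4.5, and RIPPER has no scikit-learn implementation, so it is omitted); the dataset and hyperparameters are arbitrary choices for the example, not taken from the cited study.

```python
# Heterogeneous voting ensemble: majority vote over unlike base models.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import VotingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)

# voting="hard" implements majority vote; voting="soft" instead averages
# the predicted class probabilities across the base models.
ensemble = VotingClassifier(
    estimators=[
        ("nb", GaussianNB()),                              # Bayes theory
        ("knn", KNeighborsClassifier(n_neighbors=5)),      # instance-based learning
        ("tree", DecisionTreeClassifier(random_state=0)),  # tree method (CART)
    ],
    voting="hard",
)
print("5-fold CV accuracy:", cross_val_score(ensemble, X, y, cv=5).mean())
```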
“…In this study, the vote-boosting method is used as the aggregation rule: it sets the instance weights according to the degree of agreement or disagreement among the labels assigned by the base classifiers in the last ensemble layer. Previous studies have shown that the vote-boosting method is robust to class-label noise [24].…”
Section: Feature Ranking (mentioning)
confidence: 97%
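The following is a minimal sketch of the instance-weighting idea behind vote-boosting, not the authors' exact implementation: at each round, instances are re-weighted according to how evenly the base classifiers' votes are split, without ever consulting the class labels, which is what underlies the method's robustness to label noise. The symmetric Beta emphasis function with a = 2, the randomized trees used to create diversity among members, and all other hyperparameters are illustrative assumptions.

```python
import numpy as np
from scipy.stats import beta
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=300, random_state=0)
n_estimators, a = 10, 2.0
weights = np.full(len(X), 1.0 / len(X))  # start from uniform weights
ensemble = []

for t in range(n_estimators):
    # splitter="random" injects diversity so that members can disagree.
    clf = DecisionTreeClassifier(max_depth=3, splitter="random", random_state=t)
    clf.fit(X, y, sample_weight=weights)
    ensemble.append(clf)

    # Fraction of ensemble members voting for class 1 on each instance.
    votes = np.mean([c.predict(X) for c in ensemble], axis=0)

    # Emphasize instances near maximum disagreement (votes close to 0.5)
    # with a symmetric Beta(a, a) density. The labels y are never used to
    # set the weights, hence the robustness to class-label noise.
    weights = beta.pdf(votes, a, a) + 1e-12
    weights /= weights.sum()

# Final prediction by simple majority vote over the ensemble.
y_pred = (np.mean([c.predict(X) for c in ensemble], axis=0) > 0.5).astype(int)
print("training accuracy:", np.mean(y_pred == y))
```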
“…Boosting, and the AdaBoost family of algorithms in particular, is one of the best-known forms of this paradigm. Several other sequential ensemble approaches have also been proposed recently in the literature, such as the vote-boosting algorithm [29], the SENF approach [30] and the SEL framework [31]. The parallel ensemble paradigm, on the other hand, is more popular and easier to implement: it draws on the independence and diversity of the base learners, since combining their independent decisions can effectively reduce the classification error [32].…”
Section: B Ensemble Rule-base Systems (mentioning)
confidence: 99%
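The contrast between the two paradigms can be made concrete in scikit-learn: AdaBoost builds its base learners sequentially, re-weighting the training data after each round, whereas bagging trains independent learners in parallel on bootstrap samples. The dataset and hyperparameters below are arbitrary choices for the sketch.

```python
# Sequential (AdaBoost) versus parallel (bagging) ensemble construction.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=500, random_state=0)

sequential = AdaBoostClassifier(n_estimators=50, random_state=0)  # each learner depends on the last
parallel = BaggingClassifier(n_estimators=50, random_state=0)     # learners are independent

for name, model in [("AdaBoost (sequential)", sequential),
                    ("Bagging (parallel)", parallel)]:
    print(name, cross_val_score(model, X, y, cv=5).mean())
```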