A semi-hard voting combiner scheme to ensemble multi-class probabilistic classifiers
2021 · DOI: 10.1007/s10489-021-02447-7

Cited by 12 publications (6 citation statements: 0 supporting, 6 mentioning, 0 contrasting) · References 23 publications
“…Due to data dependency, distinct learners learn in succession under the sequential ensemble technique [20]. As a result, as seen in Fig. 4, the second model successively corrects the faults produced by the first model.…”
Section: Training Baseline Classifiers (mentioning, confidence: 76%)
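The citing paper describes a boosting-style sequential ensemble in which each successive model corrects the errors of its predecessor. Below is a minimal sketch of that idea, assuming scikit-learn and a synthetic multi-class dataset; the two-stage re-weighting is illustrative, not the cited paper's exact procedure.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

# Synthetic multi-class data standing in for the citing paper's task.
X, y = make_classification(n_samples=500, n_classes=3, n_informative=6,
                           random_state=0)

# Stage 1: fit the first weak learner on uniformly weighted samples.
first = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
misclassified = first.predict(X) != y

# Stage 2: the second learner focuses on the samples the first model got
# wrong by up-weighting them (a boosting-style correction step; the 2x
# factor is an assumed value).
weights = np.where(misclassified, 2.0, 1.0)
second = DecisionTreeClassifier(max_depth=2, random_state=0)
second.fit(X, y, sample_weight=weights)
```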
“…As such, we created our proposed model by assigning different weights to these classifiers and feeding their predictions to an integrated model, which is also known as ensemble learning. The voting algorithm is one of the simplest, most popular, and effective combiner schemes for ensemble learning [45, 46]. It fuses the results from various learning algorithms to achieve knowledge discovery and better predictive performance [47–49].…”
Section: Methods (mentioning, confidence: 99%)
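Since this citing work applies weighted voting over probabilistic base classifiers, a short sketch may help. It assumes scikit-learn's VotingClassifier with soft voting (averaging predicted class probabilities); the base learners and weights are arbitrary choices for illustration, not the semi-hard scheme of the cited paper.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB

X, y = make_classification(n_samples=500, n_classes=3, n_informative=6,
                           random_state=0)

# Weighted soft-voting ensemble: each base classifier's predicted class
# probabilities are averaged with the given per-classifier weights.
ensemble = VotingClassifier(
    estimators=[("lr", LogisticRegression(max_iter=1000)),
                ("rf", RandomForestClassifier(random_state=0)),
                ("nb", GaussianNB())],
    voting="soft",        # average class probabilities rather than hard labels
    weights=[2, 3, 1],    # assumed weights, for illustration only
)
ensemble.fit(X, y)
print(ensemble.predict(X[:5]))
```

With voting="hard" the same class would instead combine majority votes over predicted labels; soft voting is shown here because the cited paper concerns probabilistic classifiers.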
“…Delgado [118] proposed a voting scheme based on a confidence level (CL) that assigns a degree of support to each weak learner, combined with a bagging approach. The degree-of-support measure depends on the error probability of each individual classifier.…”
Section: Related Work (mentioning, confidence: 99%)
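A hedged sketch of the general idea this citance describes: votes from a bagged pool of weak learners are weighted by each classifier's estimated error probability. The AdaBoost-style weight formula and the held-out error estimate are assumptions for illustration, not Delgado's exact CL scheme.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=600, n_classes=3, n_informative=6,
                           random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, random_state=0)

# Bagging: train a small pool of weak learners on bootstrap samples.
rng = np.random.default_rng(0)
pool, weights = [], []
for _ in range(5):
    idx = rng.integers(0, len(X_tr), len(X_tr))
    clf = DecisionTreeClassifier(max_depth=3).fit(X_tr[idx], y_tr[idx])
    err = np.mean(clf.predict(X_val) != y_val)  # estimated error probability
    pool.append(clf)
    # AdaBoost-style weight: low-error classifiers get more voting power
    # (assumed formula; can go negative if err > 0.5).
    weights.append(np.log((1.0 - err) / max(err, 1e-9)))

# Weighted hard voting: accumulate each classifier's weight on its
# predicted class, then take the argmax per sample.
votes = np.zeros((len(X_val), 3))
for clf, w in zip(pool, weights):
    votes[np.arange(len(X_val)), clf.predict(X_val)] += w
y_pred = votes.argmax(axis=1)
```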