2021
DOI: 10.1038/s41598-021-94347-6

Enhancing the weighted voting ensemble algorithm for tuberculosis predictive diagnosis

Abstract: Tuberculosis has the most considerable death rate among diseases caused by a single micro-organism type. The disease is a significant issue for most third-world countries due to poor diagnosis and treatment potentials. Early diagnosis of tuberculosis is the most effective way of managing the disease in patients to reduce the mortality rate of the infection. Despite several methods that exist in diagnosing tuberculosis, the limitations ranging from the cost in carrying out the test to the time taken to obtain t…

Cited by 62 publications (42 citation statements)
References 25 publications
“…The conventional majority voting assumes that all the base models are equally skilled and their predictions are treated equally when calculating the final ensemble prediction. However, the weighted majority voting assigns a specific weight to the base classifiers, which is then multiplied by the models' output when computing the final ensemble prediction [56]. Assuming the generalization ability of each base model is known, a weight W_t can be assigned to classifier h_t according to its estimated generalization ability.…”
Section: A. Combining Base Learners
confidence: 99%
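To make the weighted-vote combination concrete, here is a minimal sketch of weighted majority voting. The function name, class labels, and weight values are illustrative assumptions, not taken from the cited paper.

```python
# A minimal sketch of weighted majority voting for classification.
# Labels and weights below are illustrative, not from the cited study.
from collections import defaultdict

def weighted_majority_vote(predictions, weights):
    """Combine base-classifier predictions using per-classifier weights.

    predictions: list of class labels, one per base classifier h_t
    weights:     list of weights W_t, one per base classifier
    Returns the class label with the largest total weight.
    """
    scores = defaultdict(float)
    for label, weight in zip(predictions, weights):
        scores[label] += weight
    return max(scores, key=scores.get)

# Example: three base models vote on one sample; W_t reflects each
# model's estimated generalization ability.
print(weighted_majority_vote(["TB", "healthy", "TB"], [0.9, 0.6, 0.75]))  # -> "TB"
```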
“…The weight of each classifier in this study is determined by the classifier's prediction accuracy on the training set. The existing technique uses two single classifiers, SVM and Naïve Bayes [30]. Based on particular criteria, the weighted voting approach allocates varying weights to the classifiers and then combines their votes according to those weights.…”
Section: Methods
confidence: 99%
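As a sketch of this scheme (each classifier weighted by its training-set accuracy), the snippet below fits an SVM and a Naïve Bayes model and weights their votes accordingly. The synthetic dataset and hyperparameters are placeholders, not the study's actual configuration.

```python
# Hedged sketch: derive voting weights from training-set accuracy,
# then combine SVM and Naive Bayes votes with those weights.
from sklearn.datasets import make_classification
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC

# Synthetic stand-in data (illustrative only).
X, y = make_classification(n_samples=300, n_features=10, random_state=0)

base_models = [SVC(random_state=0), GaussianNB()]
weights = []
for model in base_models:
    model.fit(X, y)
    # Weight each classifier by its accuracy on the training set.
    weights.append(model.score(X, y))

def ensemble_predict(sample):
    """Weighted vote: each classifier's vote counts as its training accuracy."""
    votes = {}
    for model, weight in zip(base_models, weights):
        label = model.predict(sample.reshape(1, -1))[0]
        votes[label] = votes.get(label, 0.0) + weight
    return max(votes, key=votes.get)

print(ensemble_predict(X[0]), "true label:", y[0])
```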
“…The existing technique uses two single classifiers, SVM and Naïve Bayes [30]. Based on particular criteria, the weighted voting approach allocates varying weights to the classifiers and then combines their votes according to those weights.…”
Section: EWMV: Enhanced Weighted Majority Voting
confidence: 99%
“…Bagging is used to reduce variance within noisy datasets. The bagging classifier is an ensemble technique [20] used for classifying test datasets from trial participants by introducing randomization into its construction procedure and then forming an ensemble from the randomized models. A bagging classifier is an ensemble meta-estimator that fits base classifiers, each on a random subset of the original dataset, and then aggregates their individual predictions, either by voting or by averaging, to form a final prediction.…”
Section: Proposed Monitoring Framework
confidence: 99%
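A brief sketch of the bagging idea described above: base classifiers are each fitted on a bootstrap sample of the training data and their predictions are aggregated by voting. The dataset, base estimator, and ensemble size here are illustrative assumptions, not the framework's actual setup.

```python
# Hedged sketch of a bagging ensemble: each tree is fitted on a bootstrap
# sample of the training set; test predictions are aggregated by voting.
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in data (illustrative only).
X, y = make_classification(n_samples=500, n_features=12, random_state=1)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)

# 25 decision trees, each trained on a random bootstrap subset of the data.
bagging = BaggingClassifier(
    DecisionTreeClassifier(),
    n_estimators=25,
    bootstrap=True,
    random_state=1,
)
bagging.fit(X_train, y_train)
print("test accuracy:", bagging.score(X_test, y_test))
```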