2023
DOI: 10.3390/diagnostics13142391

Detection of Monkeypox Cases Based on Symptoms Using XGBoost and Shapley Additive Explanations Methods

Alireza Farzipour,
Roya Elmi,
Hamid Nasiri

Abstract: The monkeypox virus poses a novel public health risk that might quickly escalate into a worldwide epidemic. Machine learning (ML) has recently shown much promise in diagnosing diseases such as cancer, detecting tumor cells, and identifying COVID-19 patients. In this study, we created a dataset based on data collected and published by Global Health and used by the World Health Organization (WHO). Being entirely textual, this dataset captures the relationship between symptoms and monkeypox disease. The…
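The abstract describes classifying monkeypox cases from symptom data with XGBoost and explaining the model with SHAP. The sketch below is a minimal, hypothetical illustration of that kind of pipeline, not the authors' code; the symptom column names and the synthetic labels are placeholders.

```python
# Minimal sketch (not the authors' code): train an XGBoost classifier on a
# symptom-style tabular dataset and explain its predictions with SHAP values.
# The column names and synthetic data below are hypothetical placeholders.
import numpy as np
import pandas as pd
import shap
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

rng = np.random.default_rng(0)
n = 500
X = pd.DataFrame({
    "fever":         rng.integers(0, 2, n),
    "skin_lesions":  rng.integers(0, 2, n),
    "swollen_lymph": rng.integers(0, 2, n),
    "headache":      rng.integers(0, 2, n),
})
# Synthetic labels loosely tied to two of the symptoms, for illustration only.
y = ((X["skin_lesions"] + X["swollen_lymph"] + rng.random(n)) > 1.5).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

model = XGBClassifier(n_estimators=200, max_depth=3, learning_rate=0.1, eval_metric="logloss")
model.fit(X_train, y_train)
print("test accuracy:", model.score(X_test, y_test))

# SHAP (Shapley Additive Explanations): per-feature contribution to each prediction.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X_test)
shap.summary_plot(shap_values, X_test)
```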

Cited by 13 publications (4 citation statements); references 64 publications.
“…where ŷ_i is the predicted value for the i-th instance, K is the number of weak models, f_k(x_i) is the output of the k-th weak model on the i-th instance, and x_i is the feature vector for the i-th instance (Farzipour, Elmi & Nasiri, 2023; Nasiri, Homafar & Chelgani, 2021).…”
Section: Methods (mentioning)
confidence: 99%
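The expression this excerpt annotates is the standard additive prediction of a gradient-boosted ensemble; it is reconstructed below from the definitions given in the quote, not copied from the citing paper:

\hat{y}_i = \sum_{k=1}^{K} f_k(x_i)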
“…In this research, PD detection was implemented using two classification models: the logistic classifier [35] and extreme gradient boosting (XGBoost) [36]. These two classification models have been used in various computer-assisted diagnosis applications [37, 38, 39].…”
Section: Materials and Methods (mentioning)
confidence: 99%
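A minimal sketch, assuming synthetic data, of the two classifier families this excerpt names, a logistic classifier and XGBoost, fitted on the same split for comparison:

```python
# Hypothetical data: compare a logistic classifier and XGBoost on one split.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

X, y = make_classification(n_samples=400, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

logit = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
xgb = XGBClassifier(n_estimators=100, max_depth=3, eval_metric="logloss").fit(X_tr, y_tr)

print("logistic accuracy:", logit.score(X_te, y_te))
print("xgboost accuracy: ", xgb.score(X_te, y_te))
```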
“…Here, λ_i represents the Lagrange coefficients. For sample spaces that cannot be linearly separated, SVM uses kernel functions to map the samples into another space in which they become linearly separable [30, 31]. These functions are the linear kernel, the polynomial kernel, the Radial Basis Function (RBF) kernel, and the sigmoid kernel.…”
Section: Support Vector Machines (mentioning)
confidence: 99%
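A minimal sketch, assuming synthetic data, of the four kernels the excerpt lists, applied through scikit-learn's SVC:

```python
# Hypothetical data: fit an SVM with each of the four named kernel functions.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, n_features=8, random_state=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=1)

for kernel in ("linear", "poly", "rbf", "sigmoid"):
    clf = SVC(kernel=kernel).fit(X_tr, y_tr)
    print(f"{kernel:>8} kernel accuracy: {clf.score(X_te, y_te):.3f}")
```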