2021
DOI: 10.1007/s12517-021-07013-6

The XGBoost and the SVM-based prediction models for bioretention cell decontamination effect

Cited by 10 publications (2 citation statements)
References 51 publications
“…The research of Mohsen (2021) [33] showed that SVM and XGBoost ranked first in predicting leak and non-leak samples in a laboratory-scale water distribution system. Moreover, Wang et al. (2021) [34] found that XGBoost had better generalization ability than SVM and achieved higher prediction accuracy for the decontamination effect. Nagaraj and Lakshmi (2021) [35] reported that the XGBoost classifier outperformed the other machine learning algorithms assessed in their study in terms of water body extraction.…”
Section: Introduction (mentioning)
confidence: 99%
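As a concrete illustration of the XGBoost-versus-SVM comparison these statements refer to, the minimal sketch below trains an XGBoost regressor and an RBF-kernel SVM regressor on a synthetic stand-in dataset and compares their held-out RMSE and R². The data, feature count, and hyperparameters are illustrative assumptions only, not the bioretention-cell dataset or settings used by Wang et al. (2021).

```python
# Minimal sketch, assuming a synthetic regression dataset as a stand-in for
# bioretention-cell decontamination data (features -> removal efficiency).
from sklearn.datasets import make_regression
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR
from xgboost import XGBRegressor

# Hypothetical data: 8 influent/design features, one removal-efficiency target.
X, y = make_regression(n_samples=500, n_features=8, noise=10.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

models = {
    "XGBoost": XGBRegressor(
        n_estimators=300, max_depth=4, learning_rate=0.1, random_state=0
    ),
    "SVM (RBF)": make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0)),
}

# Generalization is judged on the held-out split, mirroring the comparison
# described in the citing statements.
for name, model in models.items():
    model.fit(X_train, y_train)
    pred = model.predict(X_test)
    rmse = mean_squared_error(y_test, pred) ** 0.5
    print(f"{name}: test RMSE = {rmse:.2f}, test R^2 = {r2_score(y_test, pred):.3f}")
```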
“…XGBoost performs better than Support Vector Machines (SVM) at discriminating certain diseases in patients from healthy controls, using the confusion matrix as the evaluation metric (Binson et al., 2021; Ogunleye et al., 2019). XGBoost has also outperformed Random Forests and SVM on soil liquefaction prediction with data sampled using different techniques (Demir et al., 2022) and with transformed data (Sahin, 2023); outperformed random forest and gradient boosting machine on landslide data using RMSE as the evaluation metric (Sahin, 2020); outperformed logistic regression, Bayesian Additive Regression Tree (BART), random forest, and SVM on a tumor classification problem (Zhang et al., 2023); outperformed SVM and K-nearest neighbour (KNN) on a company bankruptcy classification problem (Muslim et al., 2021); and, on surface water flooding data, showed better generalization ability than SVM in improving prediction accuracy (Wang et al., 2021).…”
(mentioning)
confidence: 99%
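The second statement groups studies that rank classifiers by confusion matrix and related metrics. The sketch below is a hedged illustration of that kind of comparison, fitting XGBoost, an RBF-kernel SVM, and KNN on synthetic classification data; none of the datasets or hyperparameters come from the cited studies.

```python
# Minimal sketch, assuming synthetic binary-classification data, of comparing
# XGBoost, SVM, and KNN by test accuracy and confusion matrix.
from sklearn.datasets import make_classification
from sklearn.metrics import accuracy_score, confusion_matrix
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from xgboost import XGBClassifier

X, y = make_classification(
    n_samples=600, n_features=12, n_informative=6, random_state=0
)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0
)

models = {
    "XGBoost": XGBClassifier(
        n_estimators=300, max_depth=4, learning_rate=0.1,
        eval_metric="logloss", random_state=0,
    ),
    "SVM (RBF)": make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0)),
    "KNN": make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5)),
}

# Each model is scored on the held-out split; the confusion matrix shows the
# per-class error structure the cited studies use for their rankings.
for name, model in models.items():
    model.fit(X_train, y_train)
    pred = model.predict(X_test)
    print(f"{name}: test accuracy = {accuracy_score(y_test, pred):.3f}")
    print(confusion_matrix(y_test, pred))
```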