2021
DOI: 10.22266/ijies2021.1231.19
Performance Comparison of Grid Search and Random Search Methods for Hyperparameter Tuning in Extreme Gradient Boosting Algorithm to Predict Chronic Kidney Failure

Cited by 20 publications (9 citation statements)
References 24 publications
“…Evaluation of this research used the accuracy, precision, recall, and F1-score, as defined by (15)-(18), respectively [29]. The training and test portions of the dataset were split by a ratio of 75% to 25%, respectively.…”
Section: Performance Evaluation
Mentioning confidence: 99%
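The quoted evaluation protocol is straightforward to reproduce. Below is a minimal sketch in Python, assuming a scikit-learn workflow; the stand-in classifier and the toy data are hypothetical, not the cited paper's model or dataset.

# Minimal sketch of the quoted protocol: 75/25 split, then the four metrics.
# All names and data here are illustrative placeholders.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score
from sklearn.linear_model import LogisticRegression  # stand-in classifier

# Hypothetical feature matrix X and binary labels y.
rng = np.random.default_rng(0)
X = rng.normal(size=(400, 10))
y = rng.integers(0, 2, size=400)

# 75% training / 25% test split, as stated in the citing paper.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42
)

model = LogisticRegression().fit(X_train, y_train)
y_pred = model.predict(X_test)

# The four metrics the evaluation reports.
print("accuracy :", accuracy_score(y_test, y_pred))
print("precision:", precision_score(y_test, y_pred))
print("recall   :", recall_score(y_test, y_pred))
print("f1-score :", f1_score(y_test, y_pred))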
“…According to the test findings, XGBoost has the highest accuracy (98%). Additionally, Anggoro and Mukti [18] adjusted the XGBoost hyperparameters.…”
Section: Introduction
Mentioning confidence: 99%
“…Use a different algorithm if the search space is too big. Random search is faster but does not always guarantee the best outcome [49].…”
Section: Hyperparameter Tuning
Mentioning confidence: 99%
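This trade-off is the core comparison of the indexed paper. A hedged sketch of both strategies applied to XGBoost follows, assuming the xgboost and scikit-learn packages; the search space and budget are illustrative, not the grids used in the cited work.

# Grid search vs. random search over the same XGBoost search space.
# The parameter values below are assumptions for illustration only.
from xgboost import XGBClassifier
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV
from sklearn.datasets import make_classification

X, y = make_classification(n_samples=400, n_features=10, random_state=0)

param_grid = {
    "max_depth": [3, 5, 7],
    "learning_rate": [0.01, 0.1, 0.3],
    "n_estimators": [100, 200],
}

# Grid search: exhaustive, 3 * 3 * 2 = 18 candidate configurations.
grid = GridSearchCV(XGBClassifier(eval_metric="logloss"), param_grid, cv=5)
grid.fit(X, y)

# Random search: samples a fixed budget (here 8) of configurations, so it is
# faster on large spaces but may miss the global best, as [49] notes.
rand = RandomizedSearchCV(
    XGBClassifier(eval_metric="logloss"), param_grid, n_iter=8, cv=5, random_state=0
)
rand.fit(X, y)

print("grid best  :", grid.best_params_)
print("random best:", rand.best_params_)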
“…For instance, under the default parameterization, each GA generation requires the training of 100 ANNs, each trained with thousands of data records. Thus, tuning all NSGA-II hyperparameters (e.g., via grid search [60-62]) would require a prohibitive computational effort. Secondly, NSGA-II works as a second-order optimization procedure, since it selects the input variables that feed the ANN training algorithm (the first-order optimization method).…”
Section: Genetic Algorithms
Mentioning confidence: 99%
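The prohibitive cost the authors point to follows from simple combinatorics: a full grid over the NSGA-II hyperparameters multiplies out against the ANN trainings performed inside every generation. A back-of-envelope sketch, where every count is an assumption except the 100 ANNs per generation stated in the quote:

# Why exhaustive tuning explodes: grid size * generations * ANNs/generation.
values_per_hyperparameter = 5   # assumed grid resolution
n_hyperparameters = 4           # assumed number of NSGA-II hyperparameters
anns_per_generation = 100       # figure stated in the citation statement
generations = 50                # assumed run length

configurations = values_per_hyperparameter ** n_hyperparameters  # 5**4 = 625
ann_trainings = configurations * generations * anns_per_generation
print(f"{configurations} configurations -> {ann_trainings:,} ANN trainings")
# 625 * 50 * 100 = 3,125,000 trainings, each over thousands of records.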