“…To optimize the performance of the XGBoost models, Bayesian optimization was performed using Optuna³⁴. The following hyperparameters were searched over the designated ranges: learning_rate, 10⁻⁴ to 10⁻¹; max_depth, 3 to 10; min_child_weight, 10⁻³ to 10²; subsample, 0.1 to 1; colsample_bytree, 0.1 to 1; reg_alpha, 10⁻⁶ to 10²; reg_lambda, 10⁻⁶ to 10²; and n_estimators, 50 to 300. During the Bayesian optimization process, an early stopping criterion of 20 rounds was applied.…”
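
For illustration, a minimal Python sketch of this search is given below. The hyperparameter ranges and the 20-round early stopping criterion follow the text; everything else (the synthetic regression dataset, the mean-squared-error objective, log-uniform sampling for the parameters spanning several orders of magnitude, and the 100-trial budget) is an assumption made for the sake of a runnable example. Optuna's default TPE sampler supplies the Bayesian optimization.

    import optuna
    import xgboost as xgb
    from sklearn.datasets import make_regression
    from sklearn.metrics import mean_squared_error
    from sklearn.model_selection import train_test_split

    # Placeholder data: a synthetic regression task stands in for the study's dataset.
    X, y = make_regression(n_samples=500, n_features=20, noise=0.1, random_state=0)
    X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.2, random_state=0)

    def objective(trial):
        # Search ranges as designated in the text; log scale assumed for
        # parameters whose ranges span several decades.
        params = {
            "learning_rate": trial.suggest_float("learning_rate", 1e-4, 1e-1, log=True),
            "max_depth": trial.suggest_int("max_depth", 3, 10),
            "min_child_weight": trial.suggest_float("min_child_weight", 1e-3, 1e2, log=True),
            "subsample": trial.suggest_float("subsample", 0.1, 1.0),
            "colsample_bytree": trial.suggest_float("colsample_bytree", 0.1, 1.0),
            "reg_alpha": trial.suggest_float("reg_alpha", 1e-6, 1e2, log=True),
            "reg_lambda": trial.suggest_float("reg_lambda", 1e-6, 1e2, log=True),
            "n_estimators": trial.suggest_int("n_estimators", 50, 300),
        }
        # 20-round early stopping, monitored on the held-out validation split.
        model = xgb.XGBRegressor(**params, early_stopping_rounds=20)
        model.fit(X_train, y_train, eval_set=[(X_val, y_val)], verbose=False)
        return mean_squared_error(y_val, model.predict(X_val))

    # Optuna's default TPE sampler performs the Bayesian optimization;
    # the 100-trial budget is an assumption, not stated in the text.
    study = optuna.create_study(direction="minimize")
    study.optimize(objective, n_trials=100)
    print(study.best_params)

Each trial draws one candidate configuration from the designated ranges, trains an XGBoost model with 20-round early stopping, and reports the validation error back to the sampler, which concentrates later trials in promising regions of the search space.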