“…For the Bayesian optimization process, the number of iterations was set to 100 (n_iter = 100) and the number of random exploration steps was set to 15 (init_points = 15). The ranges of the LightGBM hyperparameters for Bayesian optimization were set as follows: num_leaves (24, 45), feature_fraction (0.1, 0.9), bagging_fraction (0.8, 1), max_depth (5, 8.99), lambda_l1 (0, 5), lambda_l2 (0, 3), min_split_gain (0.001, 0.1), min_child_weight (5, 50). After the parameter optimization process, the final parameter values were as follows: num_leaves = 45, min_child_weight = 6.163, learning_rate = 0.01, bagging_fraction = 0.870, feature_fraction = 0.632, lambda_l1 = 0.…”
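The parameter names n_iter and init_points match the Python bayesian-optimization (bayes_opt) package, so the setup described above can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' exact code: the dataset, the task type (regression), the 5-fold CV scheme, the RMSE metric, the boosting-round budget, and the random seeds are placeholders not specified in the excerpt; only the search ranges, the fixed learning_rate of 0.01, and the optimization budget (15 random probes plus 100 iterations) come from the text.

import lightgbm as lgb
from bayes_opt import BayesianOptimization
from sklearn.datasets import make_regression

# Placeholder data; the study's actual training set is not reproduced here.
X, y = make_regression(n_samples=500, n_features=20, noise=0.1, random_state=42)
# free_raw_data=False lets the same Dataset be reused across repeated lgb.cv calls.
train_set = lgb.Dataset(X, label=y, free_raw_data=False)

def lgb_cv(num_leaves, feature_fraction, bagging_fraction, max_depth,
           lambda_l1, lambda_l2, min_split_gain, min_child_weight):
    """Cross-validated objective for the optimizer to maximize."""
    params = {
        "objective": "regression",   # assumption: regression task
        "metric": "rmse",            # assumption: RMSE as the CV metric
        "learning_rate": 0.01,       # fixed, per the final settings in the text
        # The optimizer proposes continuous values, so integer-valued parameters
        # are truncated; this presumably explains the 8.99 upper bound on max_depth.
        "num_leaves": int(num_leaves),
        "max_depth": int(max_depth),
        "feature_fraction": feature_fraction,
        "bagging_fraction": bagging_fraction,
        "lambda_l1": lambda_l1,
        "lambda_l2": lambda_l2,
        "min_split_gain": min_split_gain,
        "min_child_weight": min_child_weight,
        "verbosity": -1,
    }
    result = lgb.cv(params, train_set, num_boost_round=500,
                    nfold=5, stratified=False, seed=42)
    # The result key is version-dependent ("valid rmse-mean" in LightGBM 4.x,
    # "rmse-mean" earlier), so locate it by suffix; negate RMSE since
    # bayes_opt maximizes its objective.
    mean_key = next(k for k in result if k.endswith("rmse-mean"))
    return -min(result[mean_key])

# Search ranges exactly as listed in the text.
pbounds = {
    "num_leaves": (24, 45),
    "feature_fraction": (0.1, 0.9),
    "bagging_fraction": (0.8, 1),
    "max_depth": (5, 8.99),
    "lambda_l1": (0, 5),
    "lambda_l2": (0, 3),
    "min_split_gain": (0.001, 0.1),
    "min_child_weight": (5, 50),
}

optimizer = BayesianOptimization(f=lgb_cv, pbounds=pbounds, random_state=42)
optimizer.maximize(init_points=15, n_iter=100)  # 15 random probes + 100 BO steps
print(optimizer.max)  # best score and the corresponding parameter set

Note that reporting num_leaves = 45, the upper bound of its search range, suggests the optimum may lie at the boundary; in such a sketch one would typically widen the range and rerun to check.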