| Model | Refs | Hyperparameters | Strength |
|---|---|---|---|
| … | … | …gini, max_depth: none, n_estimators: 150 | Effectively addresses class imbalance through techniques such as weighted classes |
| AdaBoost | [?], [120] | n_estimators: 50, learning_rate: 0.5, algorithm: SAMME | Addresses class imbalance by assigning greater weight to prediction errors on positive-class samples |
| XGBoost | [113], [115], [120], [128], [196], [197] | n_estimators: 300, scale_pos_weight: 1, max_depth: 4, sampling_method: uniform, eta: 0.3, booster: gbtree | Provides numerous parameters that can be optimized |
| Bagging | [125] | n_estimators: 20, base_estimator: decision tree | More robust to outliers |
| SVM | [7], [40], [108], [120], [160], [172] | kernel: rbf, gamma: 1, C: 10 | Effective in high-dimensional feature spaces |
| KNN | [76], [108], [120], [124], [145], [199] | algorithm: ball_tree, p: 2, n_neighbors: 14 | Captures non-linear and complex patterns in data |
| MLP | [115] | optimizer: Adam, learning_rate: 0.001 | Identifies complex patterns that may be associated with cancer |
| CNN | [96] | learning_rate: 0.01, epochs: 100, optimizer: Adam | … |