“…Looking at previous studies using ML algorithms, some produced LSMs with a single ML algorithm [14][15][16][17], while others produced LSMs with multiple ML algorithms [18][19][20] and compared their performances. In these studies, algorithms such as Support Vector Machines (SVM) [21,22], K-Nearest Neighbor (KNN) [23,24], Naïve Bayes (NB) [25,26], Artificial Neural Network (ANN) [27,28], Multilayer Perceptron (MLP) [7,29], Classification and Regression Tree (CART) [30,31], Random Forest (RF) [16,32], Adaptive Boosting (AdaBoost) [33,34], Gradient Boosting Machine (GBM) [28,35], Light Gradient Boosting Machine (LightGBM) [36,37], Natural Gradient Boosting (NGBoost) [3], Extreme Gradient Boosting (XGBoost) [17], and Categorical Boosting (CatBoost) [36,38] are frequently used. Algorithms such as RF, GBM, LightGBM, AdaBoost, NGBoost, CatBoost, and XGBoost, which rely on ensemble methods such as bagging, stacking, or boosting, are called tree-based ensemble learning algorithms [36,39].…”
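A minimal sketch (not from the cited studies) of how several of the tree-based ensemble families mentioned above can be trained and compared with scikit-learn. The synthetic dataset and all parameter values here are illustrative assumptions standing in for landslide / non-landslide samples and their conditioning factors:

```python
# Illustrative comparison of tree-based ensemble classifiers:
# RF uses bagging of decision trees; GBM and AdaBoost use boosting.
from sklearn.datasets import make_classification
from sklearn.ensemble import (
    RandomForestClassifier,      # bagging ensemble
    GradientBoostingClassifier,  # gradient boosting (GBM)
    AdaBoostClassifier,          # adaptive boosting
)
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Synthetic binary data as a stand-in for landslide inventory samples
X, y = make_classification(n_samples=500, n_features=10,
                           n_informative=6, random_state=42)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          random_state=42)

models = {
    "RF": RandomForestClassifier(n_estimators=100, random_state=42),
    "GBM": GradientBoostingClassifier(n_estimators=100, random_state=42),
    "AdaBoost": AdaBoostClassifier(n_estimators=100, random_state=42),
}
scores = {name: accuracy_score(y_te, m.fit(X_tr, y_tr).predict(X_te))
          for name, m in models.items()}
print(scores)
```

LightGBM, XGBoost, CatBoost, and NGBoost follow the same fit/predict pattern but live in separate packages (`lightgbm`, `xgboost`, `catboost`, `ngboost`) rather than scikit-learn itself.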