“…Moreover, hyperparameter optimisation, one of the steps of the model development stage, plays an important role in achieving a more accurate and updatable model. Regarding studies using the steel plate fault dataset, the following ML models have been developed to address the fault classification problem: logistic regression (LR) (Fakhr and Elsayad, 2012; Simić et al., 2014; Kharal, 2020; Gamal et al., 2021), support vector machine (SVM) (Simić et al., 2014; Tian et al., 2015; Nkonyana et al., 2019; Srivastava, 2019; Gamal et al., 2021; Tasar, 2022), k-nearest neighbour (kNN) (Srivastava, 2019; Gamal et al., 2021; Tasar, 2022), naive Bayes (NB) (Kazemi et al., 2018; Gamal et al., 2021), decision tree (DT) (Fakhr and Elsayad, 2012; Chen, 2018; Kazemi et al., 2018; Srivastava, 2019; Gamal et al., 2021; Tasar, 2022), random forest (RF) (Chen, 2018; Nkonyana et al., 2019; Srivastava, 2019; Kharal, 2020; Gamal et al., 2021; Tasar, 2022), and neural network (NN) (Fakhr and Elsayad, 2012; Simić et al., 2014; Zhao et al., 2015; Kazemi et al., 2018; Nkonyana et al., 2019; Gamal et al., 2021; Tasar, 2022). However, across these studies, hyperparameter optimisation is rarely addressed (Tian et al., 2015; Zhao et al., 2015; Nkonyana et al., 2019; …”
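To make the hyperparameter optimisation step concrete, the following is a minimal sketch for one of the models listed above (random forest) using a grid search with cross-validation in scikit-learn. The synthetic data, the parameter grid, and all variable names are illustrative assumptions, not taken from any of the cited studies; in practice the steel plate fault dataset would be loaded in place of the placeholder data.

```python
# Minimal sketch: hyperparameter optimisation of a random forest with grid
# search and 5-fold cross-validation. Placeholder data stands in for the
# steel plate fault dataset (1941 samples, 27 features, 7 fault classes).
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, train_test_split

# Synthetic stand-in data (assumption for illustration only).
X, y = make_classification(n_samples=1941, n_features=27, n_informative=15,
                           n_classes=7, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42)

# Illustrative candidate hyperparameter values; every combination is evaluated
# by cross-validation and the best-scoring configuration is retained.
param_grid = {
    "n_estimators": [100, 300, 500],
    "max_depth": [None, 10, 20],
    "min_samples_split": [2, 5],
}
search = GridSearchCV(RandomForestClassifier(random_state=42),
                      param_grid, cv=5, scoring="accuracy", n_jobs=-1)
search.fit(X_train, y_train)

print("Best hyperparameters:", search.best_params_)
print("Held-out accuracy:", search.best_estimator_.score(X_test, y_test))
```

The same pattern applies to the other models in the list (SVM, kNN, NB, DT, NN); only the estimator and the parameter grid change.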