“…We conducted a grid search combined with fivefold cross-validation, as detailed in Table 6. This search evaluated the following hyperparameters of the ANN with a Multilayer Perceptron structure: hidden-layer architectures denoted as tuples (e.g., (m,n) indicates two hidden layers with m neurons in the first layer and n neurons in the second [49]), with values of (5,2), (10,5), (15,7), (20,10), (10,5,2), (15,7,3), and (20,10,5); learning rates of 0.0001, 0.001, 0.01, 0.05, and 0.1; and learning-rate update policies of 'constant,' 'adaptive,' and 'invscaling.' We used the Rectified Linear Unit (ReLU) as the activation function because of its proven effectiveness in regression tasks.…”
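
The sketch below shows one way the described grid could be expressed, assuming scikit-learn's MLPRegressor and GridSearchCV (the quoted passage does not name the library); the choice of solver='sgd', max_iter, the scoring metric, and the placeholder data X, y are assumptions added for illustration.

```python
# A minimal sketch of the hyperparameter grid described above, assuming scikit-learn.
from sklearn.model_selection import GridSearchCV
from sklearn.neural_network import MLPRegressor

param_grid = {
    # Hidden-layer architectures: (m, n) = two layers with m and n neurons, etc.
    "hidden_layer_sizes": [(5, 2), (10, 5), (15, 7), (20, 10),
                           (10, 5, 2), (15, 7, 3), (20, 10, 5)],
    # Initial learning rates.
    "learning_rate_init": [0.0001, 0.001, 0.01, 0.05, 0.1],
    # Learning-rate update policies.
    "learning_rate": ["constant", "adaptive", "invscaling"],
}

# ReLU activation is fixed across the grid; solver='sgd' is an assumption here,
# since the learning-rate schedules only take effect with stochastic gradient descent.
mlp = MLPRegressor(activation="relu", solver="sgd", max_iter=2000, random_state=0)

# Fivefold cross-validated grid search; scoring metric is an assumption.
search = GridSearchCV(mlp, param_grid, cv=5,
                      scoring="neg_mean_squared_error", n_jobs=-1)

# search.fit(X, y)            # X, y: training features and targets (placeholders)
# print(search.best_params_)  # best hyperparameter combination found
```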