2020
DOI: 10.3390/su12062339
Extreme Learning Machine Based Prediction of Soil Shear Strength: A Sensitivity Analysis Using Monte Carlo Simulations and Feature Backward Elimination

Abstract: Machine Learning (ML) has been widely applied to solving many real-world problems. However, the approach is highly sensitive to the selection of input variables for modeling and simulation. The main objective of this study is to analyze the sensitivity of an advanced ML method, namely the Extreme Learning Machine (ELM) algorithm, under different feature-selection scenarios for the prediction of the shear strength of soil. Feature backward elimination supported by Monte Carlo simulations was applied to evaluate the …
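As a rough illustration of the method named in the abstract: an ELM is a single-hidden-layer network whose input weights are drawn at random and never trained; only the output weights are fitted, by ordinary least squares. The sketch below uses synthetic data and illustrative variable names (it is not the paper's dataset or implementation).

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic regression data standing in for the soil-strength dataset
X = rng.normal(size=(200, 5))
y = X @ np.array([1.5, -2.0, 0.5, 0.0, 1.0]) + 0.1 * rng.normal(size=200)

def elm_fit(X, y, n_hidden=50, rng=rng):
    """Fit a single-hidden-layer ELM: random input weights, least-squares output weights."""
    W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights, never trained
    b = rng.normal(size=n_hidden)                 # random hidden biases
    H = np.tanh(X @ W + b)                        # hidden-layer activations
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # Moore-Penrose (pseudoinverse) solution
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

W, b, beta = elm_fit(X, y)
pred = elm_predict(X, W, b, beta)
rmse = np.sqrt(np.mean((pred - y) ** 2))
```

Because only `beta` is solved for, training reduces to one linear least-squares problem, which is why ELMs are fast; the backward-elimination step described in the abstract would wrap this fit, repeatedly dropping the least useful input column and re-fitting.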

Cited by 48 publications (24 citation statements)
References 46 publications
“…Validation performance is a critical step in a modeling procedure, for which several statistical indices have been suggested and used [13,14,[49][50][51][52]. In this study, we used Area Under the Receiver Operating Characteristic (ROC) curve (AUC) [39,[53][54][55][56], Root Mean Squared Error (RMSE) [57][58][59][60][61][62][63][64], Kappa, Accuracy (ACC), Specificity (SPF), Sensitivity (SST), Negative predictive value (NPV), and Positive predictive value (PPV) [65][66][67][68][69]. A detailed description of these indices is presented in the published literature [61,[70][71][72][73][74][75][76][77].…”
Section: Validation Methods
confidence: 99%
“…A detailed description of these indices is presented in the published literature [61,[70][71][72][73][74][75][76][77]. In general, a lower RMSE and higher values of AUC, Kappa, ACC, SPF, SST, NPV, and PPV indicate higher model performance [57,58,65,[78][79][80][81][82]. Mathematically, these performance indices are given by [60,77,[83][84][85][86][87]:…”
Section: Validation Methods
confidence: 99%
“…In this study, we used true positive (TP), true negative (TN), false positive (FP), false negative (FN), positive predictive value (PPV), negative predictive value (NPV), sensitivity (SST), specificity (SPF), accuracy (ACC), root mean square error (RMSE), and Kappa for the comparison and validation of the models. The equations for these indices are as follows [75][76][77][78][79][80][81][82]:…”
Section: Statistical Indices
confidence: 99%
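The equations themselves are truncated in the excerpts above. For reference, the standard textbook definitions of these confusion-matrix and error indices (not quoted from the citing papers) are:

```latex
\begin{align*}
\mathrm{ACC} &= \frac{TP + TN}{TP + TN + FP + FN}, &
\mathrm{SST} &= \frac{TP}{TP + FN}, &
\mathrm{SPF} &= \frac{TN}{TN + FP},\\[4pt]
\mathrm{PPV} &= \frac{TP}{TP + FP}, &
\mathrm{NPV} &= \frac{TN}{TN + FN}, &
\mathrm{RMSE} &= \sqrt{\frac{1}{n}\sum_{i=1}^{n}\left(y_i - \hat{y}_i\right)^2},\\[4pt]
\mathrm{Kappa} &= \frac{P_o - P_e}{1 - P_e}, &&&&
\end{align*}
```

where $y_i$ and $\hat{y}_i$ are observed and predicted values, $P_o$ is the observed agreement (equal to ACC), and $P_e$ is the agreement expected by chance.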
“…The NN model offers important advantages not found in traditional computational methods. No hypotheses or constraints are required when optimizing NNs [111][112][113], and they are also able to analyze and explore complex (even nonlinear) relationships in data [114][115][116]. From a computational point of view, NNs are powerful at solving high-dimensional problems because of their parallel processing capabilities [19,117,118].…”
Section: Neural Network (NN)
confidence: 99%