2019
DOI: 10.3390/app9214638

Machine-Learning-Based Classification Approaches toward Recognizing Slope Stability Failure

Abstract: In this paper, the authors investigated the applicability of combining machine-learning-based models toward slope stability assessment. To do this, several well-known machine-learning-based methods, namely multiple linear regression (MLR), multi-layer perceptron (MLP), radial basis function regression (RBFR), improved support vector machine using the sequential minimal optimization algorithm (SMO-SVM), lazy k-nearest neighbor (IBK), random forest (RF), and random tree (RT), were selected to evaluate the stability …
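
The abstract names seven learners benchmarked against one another. For readers who want to reproduce that kind of head-to-head comparison, below is a minimal sketch using scikit-learn analogues. Note the assumptions: the paper used Weka-style implementations (SMO, IBK), so SVC and KNeighborsClassifier here are stand-ins, and the synthetic data merely substitutes for the real slope-stability dataset.

```python
# Minimal sketch of a multi-model comparison like the one the abstract
# describes. Assumptions: scikit-learn analogues of the Weka learners
# (SVC ~ SMO-SVM, KNeighborsClassifier ~ IBK), synthetic data in place
# of the real slope-stability dataset.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in: six geotechnical-style features, binary stable/unstable label.
X, y = make_classification(n_samples=500, n_features=6, random_state=0)

models = {
    "MLP": MLPClassifier(max_iter=1000, random_state=0),
    "SMO-SVM (SVC analogue)": SVC(kernel="rbf"),
    "IBK (k-NN analogue)": KNeighborsClassifier(n_neighbors=5),
    "RF": RandomForestClassifier(n_estimators=100, random_state=0),
    "RT (single tree)": DecisionTreeClassifier(random_state=0),
}

for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)  # 5-fold cross-validated accuracy
    print(f"{name}: {scores.mean():.3f} +/- {scores.std():.3f}")
```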

Cited by 33 publications (14 citation statements)
References 59 publications

“…Having the features of the portfolio is significant since the classifications are done (for more, refer to [34]) based on these characteristics. In this section, we take into account having a portfolio of loans linking three features as follows [35]:…”
Section: Lending Portfolio and Features (mentioning; confidence: 99%)
“…In the experiments, the Jaccard similarity threshold was 20% and w1, w2, and w3 were … For comparison, well-known classification algorithms from the literature were used. These algorithms are Naive Bayes (NB) (Rish, 2001), Support Vector Machine (SVM) (Yue et al., 2003), Decision Tree (DT) (Kotsiantis, 2013), and IBk (Moayedi et al., 2019). A total of 20 independent experiments were carried out.…”
Section: Methods (mentioning; confidence: 99%)
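
The quoted passage applies a Jaccard similarity threshold of 20% before running the classifier comparison. A minimal sketch of that similarity check is below; the function name and the example sets are illustrative, not taken from the cited experiments.

```python
# Hedged sketch of the Jaccard-similarity check the quote mentions: two sets
# count as similar when |A ∩ B| / |A ∪ B| meets the 20% threshold.
def jaccard_similarity(a: set, b: set) -> float:
    """|A ∩ B| / |A ∪ B|; defined as 0.0 when both sets are empty."""
    union = a | b
    return len(a & b) / len(union) if union else 0.0

THRESHOLD = 0.20  # the 20% threshold used in the quoted experiments

# Illustrative feature sets (hypothetical, not from the paper):
a = {"loan", "rate", "term"}
b = {"loan", "rate", "amount", "score"}
sim = jaccard_similarity(a, b)
print(f"similarity = {sim:.2f}, similar = {sim >= THRESHOLD}")  # 0.40, True
```
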
“…(a) From the available dataset, randomly draw a new training set (bootstrap sample) with replacement; (b) Grow a tree using the bootstrap sample by iteratively splitting the nodes until no further splits are possible or the user-defined stopping criterion is reached. In order to split the nodes at the most informative feature, we use an objective function to maximize the information gain at each split, which is defined as [23]:…”
Section: Feature Selection Using Random Forest Regression Methods (mentioning; confidence: 99%)
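
The quotation truncates before reference [23]'s formula. The sketch below therefore assumes the standard information-gain objective, IG(D_p, f) = I(D_p) − Σ_j (N_j / N_p) · I(D_j), with entropy as the impurity measure I; it is a common reading of steps (a) and (b), not a reconstruction of that source.

```python
# Hedged sketch of steps (a) and (b) from the quote. The information-gain
# objective is assumed to be the standard one,
#   IG(D_p, f) = I(D_p) - sum_j (N_j / N_p) * I(D_j),
# with entropy as the impurity measure; the quote truncates before [23]'s
# exact definition, so this is an assumption, not that source's formula.
import numpy as np

def entropy(y: np.ndarray) -> float:
    """Impurity I(D): Shannon entropy of the label distribution."""
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def information_gain(y_parent, y_left, y_right) -> float:
    """IG = I(D_p) minus the sample-weighted impurity of the child nodes."""
    n = len(y_parent)
    return entropy(y_parent) - (
        len(y_left) / n * entropy(y_left) + len(y_right) / n * entropy(y_right)
    )

rng = np.random.default_rng(0)
y = rng.integers(0, 2, size=100)                       # toy binary labels
boot = rng.choice(len(y), size=len(y), replace=True)   # (a) bootstrap sample, drawn with replacement
y_boot = y[boot]
# (b) evaluate one candidate split of the bootstrap sample:
print(information_gain(y_boot, y_boot[:40], y_boot[40:]))
```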