2016
DOI: 10.1016/j.neucom.2015.08.118
Displacement prediction of landslide based on generalized regression neural networks with K-fold cross-validation

Cited by 138 publications (47 citation statements) · References 27 publications
“…Model tuning. For all ML algorithms, k-fold cross-validation is performed to determine the optimal model settings and to evaluate how well the resulting model generalizes to an independent data set 47,48 . K-fold cross-validation also helps to avoid overfitting.…”
Section: Ensemble Learning
confidence: 99%
“…This process is repeated K times, and each of the K subsamples is used exactly once as the validation data. The K results from the folds can then be averaged (or otherwise combined) to produce a single estimate (Jiang & Chen, 2016). This strategy was used for SVM validation with K = 10, and the mean accuracy was taken as the final accuracy for the SVM.…”
Section: Methods
confidence: 99%
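The procedure quoted above can be sketched in a few lines: partition the samples into K folds, use each fold exactly once as validation data, and average the K per-fold scores into a single estimate. The toy threshold "model" and data below are hypothetical stand-ins for the SVM used in the cited study.

```python
# Minimal K-fold cross-validation sketch (K = 10): each of the K subsamples
# serves exactly once as validation data, and the K per-fold accuracies are
# averaged into a single estimate. Model and data are illustrative only.

def k_fold_indices(n_samples, k):
    """Partition sample indices into k roughly equal, contiguous folds."""
    sizes = [n_samples // k + (1 if i < n_samples % k else 0) for i in range(k)]
    folds, start = [], 0
    for size in sizes:
        folds.append(list(range(start, start + size)))
        start += size
    return folds

def cross_validate(fit, score, X, y, k=10):
    """Average the validation score over k folds."""
    scores = []
    for val_idx in k_fold_indices(len(X), k):
        train_idx = [i for i in range(len(X)) if i not in set(val_idx)]
        params = fit([X[i] for i in train_idx], [y[i] for i in train_idx])
        scores.append(score(params,
                            [X[i] for i in val_idx],
                            [y[i] for i in val_idx]))
    return sum(scores) / k

# Toy "model": a threshold classifier whose threshold is the training mean.
fit = lambda X, y: sum(X) / len(X)
score = lambda t, X, y: sum((x > t) == yy for x, yy in zip(X, y)) / len(X)

X = [float(i) for i in range(20)]
y = [x > 9.5 for x in X]
print(round(cross_validate(fit, score, X, y, k=10), 2))  # -> 1.0
```

The averaging step is what makes the estimate less sensitive to any one train/validation split than a single hold-out set would be.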
“…A small number of hidden nodes may cause under-fitting, while larger numbers favor increasing accuracy but carry a greater risk of over-fitting. Instead of a trial-and-error approach to optimizing this tradeoff, k-fold cross-validation (CV) is pursued here to select an optimal number of neurons [107,108]. The CV approach sequentially partitions the…”
Section: Funding
confidence: 99%
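The model-selection use of CV described in this excerpt amounts to scoring each candidate network size on held-out folds and keeping the one with the lowest mean validation error. A hedged sketch, in which the per-fold error of the "network" is a simulated U-shaped curve (under-fitting for small sizes, over-fitting for large ones) rather than an actual trained model:

```python
# Selecting the number of hidden nodes by k-fold cross-validation:
# for each candidate size, average the validation error over k folds
# and keep the size with the lowest mean error.

def cv_error(n_hidden, fold):
    # Hypothetical per-fold validation error: large for too few nodes
    # (under-fitting) and for too many (over-fitting), minimal in between.
    return (n_hidden - 8) ** 2 / 10 + 0.05 * fold

def select_n_hidden(candidates, k=5):
    """Return the candidate with the lowest mean CV error over k folds."""
    mean_err = {n: sum(cv_error(n, f) for f in range(k)) / k
                for n in candidates}
    return min(mean_err, key=mean_err.get)

print(select_n_hidden([2, 4, 8, 16, 32]))  # -> 8
```

In practice `cv_error` would train the network with `n_hidden` neurons on the training folds and measure its error on the validation fold; the selection loop itself is unchanged.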