2023
DOI: 10.3390/ijerph20064977
Application of Bagging, Boosting and Stacking Ensemble and EasyEnsemble Methods for Landslide Susceptibility Mapping in the Three Gorges Reservoir Area of China

Abstract: Since the impoundment of the Three Gorges Reservoir area in 2003, the potential risks of geological disasters in the reservoir area have increased significantly, among which the hidden dangers of landslides are particularly prominent. To reduce casualties and damage, efficient and precise landslide susceptibility evaluation methods are important. Multiple ensemble models have been used to evaluate the susceptibility of the upper part of Badong County to landslides. In this study, EasyEnsemble technology was us…
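The three ensemble strategies named in the title can be sketched as follows. This is an illustrative example only, not the paper's pipeline: it uses scikit-learn on a synthetic binary-classification dataset standing in for landslide conditioning factors, and it omits EasyEnsemble (available separately as imbalanced-learn's `EasyEnsembleClassifier`) to keep the dependency to scikit-learn alone. All names and parameters here are assumptions for demonstration.

```python
# Hedged sketch: bagging, boosting, and stacking ensembles on synthetic data.
# This does NOT reproduce the paper's landslide-susceptibility models.
from sklearn.datasets import make_classification
from sklearn.ensemble import (AdaBoostClassifier, BaggingClassifier,
                              StackingClassifier)
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for conditioning factors (slope, lithology, rainfall, ...).
X, y = make_classification(n_samples=1000, n_features=10, n_informative=6,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

ensembles = {
    # Bagging: trees trained on bootstrap resamples, predictions averaged.
    "bagging": BaggingClassifier(n_estimators=50, random_state=0),
    # Boosting: learners fit sequentially, each reweighting prior errors.
    "boosting": AdaBoostClassifier(n_estimators=50, random_state=0),
    # Stacking: a meta-learner combines the base models' predictions.
    "stacking": StackingClassifier(
        estimators=[("tree", DecisionTreeClassifier(max_depth=3,
                                                    random_state=0)),
                    ("logit", LogisticRegression(max_iter=1000))],
        final_estimator=LogisticRegression(max_iter=1000)),
}

scores = {name: model.fit(X_tr, y_tr).score(X_te, y_te)
          for name, model in ensembles.items()}
print(scores)
```

In a susceptibility-mapping workflow, the fitted models' class probabilities (rather than hard labels) would then be mapped back onto grid cells to produce the susceptibility surface.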

Cited by 14 publications (2 citation statements) | References 32 publications
“…Parallel to classification, Ensemble Modeling (EM) has also emerged as a promising technique combining the predictions from multiple ML models to improve overall performance [15]. Ensemble methods, such as bagging, boosting, stacking, and random forest, contribute to the field by mitigating inherent challenges [15]. By aggregating the predictions of multiple base classifiers, ensemble techniques reduce the risks of overfitting, underfitting, and biases that can affect individual classifiers.…”

Section: Introduction
confidence: 99%