2020
DOI: 10.3390/w12030683
GIS Based Hybrid Computational Approaches for Flash Flood Susceptibility Assessment

Abstract: Flash floods are one of the most devastating natural hazards; they occur within a catchment (region) where the response time of the drainage basin is short. Identification of probable flash flood locations and development of accurate flash flood susceptibility maps are important for proper flash flood management of a region. With this objective, we proposed and compared several novel hybrid computational approaches of machine learning methods for flash flood susceptibility mapping, namely AdaBoostM1 based Cred…


Cited by 153 publications (57 citation statements)
References 122 publications
“…For example, Nhu et al [36] showed that the reduced error pruning tree (REPT) performed better in combination with RSS than with the bagging and AdaBoost techniques for gully erosion prediction, whereas Pham et al [92] reported that the REPT model performed better with rotation forest and bagging than in combination with RSS and multiboost for landslide prediction. Different results have also been reported for flood prediction based on ensemble models [93,94]. From these studies, we can conclude that machine learning and ensemble learning techniques are highly case- and site-specific, and that their performance depends heavily on the local conditions under which the training datasets are developed, indicating that the application of different methods in different regions should be continued to find the optimum method for each environmental setting [95].…”
Section: Discussion
confidence: 94%
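The case-specificity described in this excerpt is easy to demonstrate in a few lines. The following is a hedged sketch, not code from any cited study: it uses synthetic data and assumes scikit-learn is available, and since REPT is a Weka model with no scikit-learn equivalent, an ordinary depth-limited decision tree stands in as the base learner for the bagging and AdaBoost wrappers.

```python
# Sketch: comparing a single base tree against bagging and AdaBoost
# ensembles on synthetic data. Rankings between the ensembles can flip
# when the dataset changes, which is the point made in the text above.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for a susceptibility dataset (conditioning factors X,
# event/non-event labels y); real studies would use mapped inventories.
X, y = make_classification(n_samples=400, n_features=10, random_state=0)

base = DecisionTreeClassifier(max_depth=3, random_state=0)
models = {
    "single tree": base,
    "bagging": BaggingClassifier(base, n_estimators=50, random_state=0),
    "AdaBoost": AdaBoostClassifier(base, n_estimators=50, random_state=0),
}

# Mean 5-fold cross-validated accuracy for each model.
scores = {name: cross_val_score(m, X, y, cv=5).mean() for name, m in models.items()}
for name, acc in scores.items():
    print(f"{name}: {acc:.3f}")
```

On a different synthetic seed (or a real inventory), the ordering of the two ensembles may change, mirroring the contradictory findings reported in [36] and [92].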
“…For example, Nhu et al [36] showed that reduced error pruning tree (REPT) performed better in combination with RSS than the bagging and AdaBoost techniques for gully erosion prediction, whereas Pham et al [92] reported that the REPT model performed better with rotation forest and bagging than its combination with the RSS and multiboost for landslide prediction. Different results have also been reported for flood prediction based on the ensemble models [93,94]. From these studies, we can conclude that the machine learning and ensemble learning techniques are greatly case-and site-specific, and that their performances depend heavily on the local conditions that the training datasets are developed upon, indicating that the application of different methods in different regions should be continued to find the optimum method for each environmental setting [95].…”
Section: Discussionmentioning
confidence: 94%
“…A detailed description of these indices is presented in the published literature [61,70–77]. In general, a lower RMSE and higher values of AUC, Kappa, ACC, SPF, SST, NPV, and PPV indicate higher model performance [57,58,65,78–82]. Mathematically, these performance indices are given by [60,77,83–87]:…”
Section: Validation Methods
confidence: 99%
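The equations referenced at the end of this excerpt did not survive extraction. The indices named (ACC, SST, SPF, PPV, NPV, Kappa, RMSE) follow standard confusion-matrix definitions, so the sketch below is a reconstruction of the usual formulas, not the paper's own notation; variable names (tp, tn, fp, fn) are mine.

```python
import math

def classification_indices(tp, tn, fp, fn):
    """Standard performance indices from a 2x2 confusion matrix."""
    n = tp + tn + fp + fn
    acc = (tp + tn) / n        # ACC: overall accuracy
    sst = tp / (tp + fn)       # SST: sensitivity (true positive rate)
    spf = tn / (tn + fp)       # SPF: specificity (true negative rate)
    ppv = tp / (tp + fp)       # PPV: positive predictive value
    npv = tn / (tn + fn)       # NPV: negative predictive value
    # Cohen's Kappa: observed agreement (acc) corrected for chance agreement (pe)
    pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2
    kappa = (acc - pe) / (1 - pe)
    return {"ACC": acc, "SST": sst, "SPF": spf, "PPV": ppv, "NPV": npv, "Kappa": kappa}

def rmse(observed, predicted):
    """Root-mean-square error between 0/1 labels and predicted probabilities."""
    return math.sqrt(sum((o - p) ** 2 for o, p in zip(observed, predicted)) / len(observed))

# Example: 40 true positives, 45 true negatives, 5 false positives, 10 false negatives.
print(classification_indices(tp=40, tn=45, fp=5, fn=10))
# -> ACC 0.85, SST 0.80, SPF 0.90, Kappa 0.70
```

AUC is omitted because it requires ranked probability scores rather than a single confusion matrix; any standard ROC implementation computes it from the same predictions used for RMSE.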
“…These methods extract related patterns in historical data to predict future events [73]. Data mining methods used to predict gully erosion include logistic regression (LR) [2,30,74–77], artificial neural network (ANN) [20,48,78–80], random subspace (RS) [48,62,81], maximum entropy (ME) [82], artificial neural fuzzy system (ANFIS) [56,83–86], support vector machine (SVM) [18,59,73], fuzzy analytical network (FAN) [37], multi-criteria decision analysis (MCDA) [87,88], evidential belief function (EBF) [88,89], classification and regression tree (CART) [90,91], random forest (RF) [39,52,92–94], rotation forest (RoF) [95], weights of evidence (WofE) [96], frequency ratio (FR) [28,97], BFTree for gully headcut [81], boosted regression [24], ADTree, RF-ADTree [73,76,98], and naive B…”
Section: Introduction
confidence: 99%
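Of the methods listed in this excerpt, the frequency ratio (FR) is the simplest to illustrate: for each class of a conditioning factor, it divides the share of recorded events in that class by the share of study area the class covers. The sketch below is a minimal, generic illustration with hypothetical class names and toy counts, not data from any cited study.

```python
from collections import Counter

def frequency_ratio(factor_classes, event_labels):
    """Frequency ratio (FR) per class of one conditioning factor.

    FR = (share of events in class) / (share of area in class).
    FR > 1 marks classes where events are over-represented relative
    to the area they occupy.
    """
    n_pixels = len(factor_classes)
    n_events = sum(event_labels)
    area = Counter(factor_classes)                                  # cells per class
    events = Counter(c for c, e in zip(factor_classes, event_labels) if e)
    return {c: (events.get(c, 0) / n_events) / (area[c] / n_pixels) for c in area}

# Toy raster: 100 cells in two hypothetical slope classes, 40 recorded events.
classes = ["low"] * 50 + ["high"] * 50
labels = [1] * 10 + [0] * 40 + [1] * 30 + [0] * 20
print(frequency_ratio(classes, labels))  # -> {'low': 0.5, 'high': 1.5}
```

Summing the FR values of a cell's classes across all conditioning factors yields a simple susceptibility index, which is how FR is typically combined into the maps these studies produce.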