2021
DOI: 10.1007/s12594-021-1785-0
Assessment of Flood Frequency using Statistical and Hybrid Neural Network Method: Mahanadi River Basin, India

Abstract: Flooding is the most common and widespread natural hazard affecting societies around the globe. In this context, forecasting of peak flood discharge is necessary for planning, designing and managing hydraulic structures and is crucial for decision makers seeking to mitigate flooding risks. This study investigates the potential of four of the most frequently used traditional statistical distribution techniques and three neural network algorithms for flood forecasting. The four statistical methods include Generalized Extreme Value (G…

Cited by 37 publications (9 citation statements) · References 36 publications
“…Recorded sample data sets pose a critical input in the probability estimation, whereas standard norms can be considered 30 years; on the contrary, a lack of temporal data series leads to poor frequency results (Bobee & Robitaille, 1977; Saghafian et al, 2014). However, many studies around the world have been carried out with the probability methods to understand the flood hazard, especially in the data‐scarce area (ungauged stations) or with at least 10 years or above data range (Odunuga & Raji, 2014; Samantaray et al, 2021). Different probability distribution approaches like Gumbel, Log‐Pearson, and Log‐Normal are widely applicable or suitable for frequency analysis at a particular‐site (hydrological station) or regional level frequency analysis (Ul Hassan et al, 2019).…”
Section: Data Sources and Methodology
confidence: 99%
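The at-site distribution fitting described in this excerpt can be illustrated with a short script. The sketch below is a minimal example, assuming a hypothetical series of annual peak discharges and using SciPy's Gumbel and Log-Normal fits to estimate a 100-year flood quantile; none of the values come from the Mahanadi basin study.

```python
# Minimal sketch of at-site flood frequency analysis with the Gumbel and
# Log-Normal distributions. The annual peak discharges below are illustrative
# placeholders, not data from the Mahanadi River basin.
import numpy as np
from scipy import stats

annual_peaks = np.array([4200., 5100., 3900., 6800., 4500.,
                         7200., 5600., 4900., 6100., 5300.])  # m^3/s, hypothetical

return_period = 100                         # years
non_exceedance_p = 1.0 - 1.0 / return_period

# Gumbel (Extreme Value Type I): fit location and scale by maximum likelihood.
loc_g, scale_g = stats.gumbel_r.fit(annual_peaks)
q100_gumbel = stats.gumbel_r.ppf(non_exceedance_p, loc=loc_g, scale=scale_g)

# Log-Normal: fit shape and scale (location fixed at zero), then take the same quantile.
shape_ln, loc_ln, scale_ln = stats.lognorm.fit(annual_peaks, floc=0)
q100_lognorm = stats.lognorm.ppf(non_exceedance_p, shape_ln, loc=loc_ln, scale=scale_ln)

print(f"100-year flood (Gumbel):     {q100_gumbel:.0f} m^3/s")
print(f"100-year flood (Log-Normal): {q100_lognorm:.0f} m^3/s")
```

In practice the candidate distributions would be compared with goodness-of-fit tests (for example Kolmogorov-Smirnov or chi-square) before a design discharge is adopted.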
“…In contrast, DD models mathematically capture non-linear or linear interactions amid Qflow and its descriptive variables [8]. DD methods like ANN (artificial neural network), SVM, and ANFIS (adaptive neuro-fuzzy inference system), have been successfully applied by many investigators in hydrological field of study; for example, sediment transport modelling [9,10,11,12], runoff modeling [13,14,15], water table depth prediction [16,17,18], streamflow forecasting [19,20,21]; pan evaporation estimation [22,23]; flood prediction [24,25,26,27,28].…”
Section: Introduction
confidence: 99%
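The data-driven (DD) models listed in this excerpt can be sketched with a small feed-forward ANN. The example below is a minimal illustration, assuming a synthetic discharge series and lagged-discharge inputs with scikit-learn's MLPRegressor; it is not the configuration used in the cited works.

```python
# Minimal sketch of a data-driven streamflow model: a feed-forward ANN that
# predicts the next day's discharge from the previous three days.
# The discharge series is synthetic and purely illustrative.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
flow = np.cumsum(rng.normal(0, 1, 500)) + 100.0   # synthetic daily discharge series

# Build lagged inputs: Q(t-3), Q(t-2), Q(t-1) -> target Q(t).
lags = 3
X = np.column_stack([flow[i:len(flow) - lags + i] for i in range(lags)])
y = flow[lags:]

split = int(0.8 * len(y))
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0),
)
model.fit(X[:split], y[:split])
print("Test R^2:", model.score(X[split:], y[split:]))
```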
“…FFA can be easily done with the help of many statistical distribution functions, including Normal, Log-Normal, Gumbel, Pearson Type III, Log-Pearson Type III, Weibull, Generalized Extreme Value, and Generalized Logistic functions. There are many studies in the literature in which these distributions are used (Zhang et al, 2017;Farooq et al, 2018;Bhat et al, 2019;Langat et al, 2019;Samantaray and Sahoo, 2021;Sahoo and Ghose, 2021;Umar et al, 2021;Mangukiya et al, 2022).…”
Section: Introduction
confidence: 99%
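A short sketch of the GEV-based frequency analysis mentioned above is given below, assuming a hypothetical annual-peak series and using SciPy's genextreme fit to read off return levels for several return periods.

```python
# Minimal sketch of fitting the Generalized Extreme Value (GEV) distribution
# and computing return levels. Values and units are illustrative only.
import numpy as np
from scipy import stats

annual_peaks = np.array([3200., 4100., 2900., 5800., 3500., 6200.,
                         4600., 3900., 5100., 4300., 4800., 3700.])  # m^3/s, hypothetical

# SciPy parameterises the GEV with shape c (opposite sign to the xi convention
# commonly used in hydrology texts).
c, loc, scale = stats.genextreme.fit(annual_peaks)

for T in (2, 10, 50, 100):
    q_T = stats.genextreme.ppf(1.0 - 1.0 / T, c, loc=loc, scale=scale)
    print(f"{T:>3}-year return level: {q_T:.0f} m^3/s")
```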