2022
DOI: 10.31223/x5hm08
Preprint

A Systematic Review of Deep Learning Applications in Streamflow Data Augmentation and Forecasting

Abstract: The volume and variety of Earth data have increased as a result of growing attention to climate change and, subsequently, the availability of large-scale sensor networks and remote sensing instruments. This data has been an important resource for data-driven studies to generate practical knowledge and services, support environmental modeling and forecasting needs, and transform climate and earth science research thanks to the increased availability of computational resources and the popularity of novel computa…

Cited by 6 publications (4 citation statements). References 145 publications. Citing publications appeared in 2022 and 2024.

Citation statements (ordered by relevance):
“…To add more features to the deep neural network model, a long short-term memory (LSTM) network (Hochreiter and Schmidhuber, 1997) extends the lifespan of the short-term memory. In hydrological forecasting tasks that require longer memory, such as flood forecasting and rainfall forecasting, LSTMs and their variants are frequently used (Sit and Demir, 2019; Sit et al., 2022a). A concise description of an LSTM node and its mathematical foundation can be found in Equations 4 through 9 and Figures 6 and 7.…”
Section: Long Short-Term Memory (LSTM) Network
Mentioning confidence: 99%
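The Equations 4 through 9 and Figures 6 and 7 referenced above belong to the citing paper and are not reproduced on this page. For context, the standard LSTM node they summarize is, in its modern gated form, commonly written as follows; the notation here is a conventional choice, not necessarily that of the citing paper.

```latex
% Standard LSTM cell update for input x_t, previous hidden state h_{t-1},
% and previous cell state c_{t-1}; \sigma is the logistic sigmoid and
% \odot the element-wise product.
\begin{align}
i_t &= \sigma(W_i x_t + U_i h_{t-1} + b_i)        && \text{input gate} \\
f_t &= \sigma(W_f x_t + U_f h_{t-1} + b_f)        && \text{forget gate} \\
o_t &= \sigma(W_o x_t + U_o h_{t-1} + b_o)        && \text{output gate} \\
\tilde{c}_t &= \tanh(W_c x_t + U_c h_{t-1} + b_c) && \text{candidate cell state} \\
c_t &= f_t \odot c_{t-1} + i_t \odot \tilde{c}_t  && \text{cell state update} \\
h_t &= o_t \odot \tanh(c_t)                       && \text{hidden state (output)}
\end{align}
```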
“…Estimating streamflow for an ungauged basin, or regionalizing an ungauged basin, is often done by taking advantage of spatial proximity, physical similarity, regression-based methods, or hydrological signature methods. Besides these approaches, Guo et al. (2021) proposed a machine learning approach for this task, although its use remains quite limited in the literature (Sit et al., 2022a).…”
Section: Introduction
Mentioning confidence: 99%
“…Many classical machine learning approaches have been used in streamflow forecasting and environmental studies (Bayar et al., 2009; Li and Demir, 2023), including Support Vector Machines (SVMs) and Linear Regression (LR) (Granata et al., 2016; Yan et al., 2018; Sharma and Machiwal, 2021). However, advancements in artificial intelligence (AI), coupled with the increasing capabilities of graphics processing units (GPUs), have opened up new possibilities and accelerated the progress of deep learning techniques, leading to their widespread use in streamflow forecasting as well (Sit et al., 2022a). Among the various neural network architectures explored for streamflow forecasting (Sit et al., 2021a; Xiang and Demir, 2022b; Chen et al., 2023), Recurrent Neural Networks (RNNs), especially the Long Short-Term Memory (LSTM) network and Gated Recurrent Units (GRUs), have emerged as the most extensively studied models in this domain.…”
Section: Introduction
Mentioning confidence: 99%
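The specific architectures cited above vary from paper to paper. As a minimal, hypothetical illustration of the lookback-window-to-forecast setup that LSTM-based streamflow models typically follow, the sketch below uses PyTorch with assumed layer sizes, feature count, and forecast horizon; it is not the model of any particular cited study.

```python
# Minimal sketch (illustrative only): an LSTM that maps a window of past
# sensor features to a multi-step streamflow forecast.
import torch
import torch.nn as nn

class StreamflowLSTM(nn.Module):
    def __init__(self, n_features=4, hidden_size=64, horizon=24):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, horizon)  # one value per forecast step

    def forward(self, x):              # x: (batch, lookback, n_features)
        _, (h_n, _) = self.lstm(x)     # h_n: (1, batch, hidden_size)
        return self.head(h_n[-1])      # (batch, horizon)

# Example: 72-step lookback of 4 features -> 24-step discharge forecast
model = StreamflowLSTM()
y_hat = model(torch.randn(8, 72, 4))   # y_hat.shape == (8, 24)
```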
“…Sit et al. (2021) proposed a model that combines graph convolutional and GRU architectures, following the methods developed by Seo et al. (2018), to predict 36 hr of streamflow using data from different sensors in a river network.…”
Section: Introduction
Mentioning confidence: 99%
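The exact Sit et al. (2021) architecture is not detailed in this excerpt. The sketch below is only a minimal illustration of the general graph-convolution-plus-GRU idea in the spirit of Seo et al. (2018): each river-network sensor is a graph node, a simple normalized-adjacency graph convolution mixes neighboring sensors at every time step, and a GRU models the temporal evolution of each node. Layer sizes, adjacency handling, and the 36-step horizon are assumptions for illustration.

```python
# Minimal sketch (illustrative only) of a graph-convolution + GRU forecaster
# over a river sensor network.
import torch
import torch.nn as nn

class GraphConvGRU(nn.Module):
    def __init__(self, adj, n_features=1, hidden_size=32, horizon=36):
        super().__init__()
        # Symmetrically normalized adjacency with self-loops: D^-1/2 (A + I) D^-1/2
        a_hat = adj + torch.eye(adj.size(0))
        d_inv_sqrt = a_hat.sum(dim=1).pow(-0.5).diag()
        self.register_buffer("a_norm", d_inv_sqrt @ a_hat @ d_inv_sqrt)
        self.gconv = nn.Linear(n_features, hidden_size)  # node-wise feature transform
        self.gru = nn.GRU(hidden_size, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, horizon)      # e.g. 36 hourly forecasts

    def forward(self, x):                # x: (batch, time, nodes, n_features)
        h = torch.relu(self.a_norm @ self.gconv(x))      # spatial mixing per step
        b, t, n, f = h.shape
        h = h.permute(0, 2, 1, 3).reshape(b * n, t, f)   # one sequence per node
        _, h_n = self.gru(h)                             # h_n: (1, b*n, hidden_size)
        return self.head(h_n[-1]).reshape(b, n, -1)      # (batch, nodes, horizon)

# Example: 5 sensors, 48 past steps of discharge -> 36-step forecast per sensor
adj = torch.rand(5, 5).round()
model = GraphConvGRU(adj)
print(model(torch.randn(2, 48, 5, 1)).shape)  # torch.Size([2, 5, 36])
```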