2020
DOI: 10.1029/2019WR024908

The Utility of Information Flow in Formulating Discharge Forecast Models: A Case Study From an Arid Snow‐Dominated Catchment

Abstract: Streamflow forecasts often perform poorly because of improper representation of hydrologic response timescales in underlying models. Here, we use transfer entropy (TE), which measures information flow between variables, to identify dominant drivers of discharge and their timescales using sensor data from the Dry Creek Experimental Watershed, ID, USA. Consistent with previous mechanistic studies, TE revealed that snowpack accumulation and partitioning into melt, recharge, and evaporative loss dominated discharg…
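The abstract's central tool, transfer entropy, can be illustrated with a minimal plug-in estimator for discretized series. This is a sketch, not the paper's implementation: the binary series, lag of one step, and function name are assumptions for demonstration only.

```python
# Minimal plug-in estimator of transfer entropy TE(X -> Y) for discrete
# series: the extra information x[t] carries about y[t+lag] beyond what
# y[t] already provides. Illustrative sketch only.
import numpy as np
from collections import Counter

def transfer_entropy(x, y, lag=1):
    """TE(X -> Y) in bits for discrete-valued sequences x, y."""
    trips = list(zip(y[lag:], y[:-lag], x[:-lag]))  # (y_next, y_now, x_now)
    n = len(trips)
    p_xyz = Counter(trips)                           # counts of (y_next, y_now, x_now)
    p_yz = Counter((a, b) for a, b, _ in trips)      # counts of (y_next, y_now)
    p_z = Counter(b for _, b, _ in trips)            # counts of y_now
    p_zx = Counter((b, c) for _, b, c in trips)      # counts of (y_now, x_now)
    te = 0.0
    for (a, b, c), n_abc in p_xyz.items():
        p_cond_full = n_abc / p_zx[(b, c)]           # p(y_next | y_now, x_now)
        p_cond_hist = p_yz[(a, b)] / p_z[b]          # p(y_next | y_now)
        te += (n_abc / n) * np.log2(p_cond_full / p_cond_hist)
    return te

rng = np.random.default_rng(0)
x = rng.integers(0, 2, 5000)   # driver series (e.g., a binarized forcing)
y = np.roll(x, 1)              # response copies x with a one-step delay
print(transfer_entropy(x, y))  # near 1 bit: strong flow from x to y
print(transfer_entropy(y, x))  # near 0 bits: no reverse flow
```

In a hydrologic application, the same asymmetry (large TE in one direction, small in the other) is what identifies a variable as a driver of discharge rather than a mere correlate.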

Cited by 37 publications (30 citation statements)
References 82 publications
“…However, too large sequence lengths increase the model complexity and training time that can in turn reduce the performance (Duan et al, 2020). Some previous studies have set the sequence length to an arbitrary number (Zhang et al, 2018a; Le et al, 2019; Tennant et al, 2020), whereas some other studies have reported that the sequence length affects the model performance (Fan et al, 2020; Meyal et al, 2020; Xiang et al, 2020). We therefore investigated the LSTM sensitivity to the sequence length.…”
Section: LSTM Model Sensitivity to Input Features and Sequence Length
confidence: 99%
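The sequence-length question raised in the excerpt above can be made concrete: the sequence length sets how many antecedent time steps each training sample exposes to the LSTM. The toy discharge series and candidate lengths below are illustrative assumptions, not values from any of the cited studies.

```python
# Sketch of how the LSTM sequence length shapes the training data:
# each sample is a sliding window of `seq_len` antecedent steps,
# paired with the next-step target. Illustrative only.
import numpy as np

def make_sequences(series, seq_len):
    """Stack sliding windows of length seq_len as inputs X,
    paired with the following value as target y."""
    X = np.stack([series[i:i + seq_len]
                  for i in range(len(series) - seq_len)])
    y = series[seq_len:]
    return X, y

flow = np.arange(100, dtype=float)  # stand-in for a daily discharge record
for seq_len in (7, 30, 90):        # candidate memory lengths (days)
    X, y = make_sequences(flow, seq_len)
    print(seq_len, X.shape, y.shape)
```

Longer windows let the model see slower processes (e.g., seasonal snowpack storage) but enlarge each sample and shrink the usable sample count, which is the complexity/performance trade-off the excerpt describes.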
“…Compared with traditional machine learning methods, deep neural networks have more advanced architectures and a larger number of neurons (Shen, 2018). Many studies have shown the advantages of deep learning, for example, Long Short‐Term Memory networks (LSTM; Hochreiter & Schmidhuber, 1997) that significantly improve the predictive accuracy of streamflow, water level depth, soil moisture, and so on (e.g., Fang et al., 2017; Kratzert et al., 2018; Shi et al., 2015; Tennant et al., 2020; Xiang et al., 2020; X. Zhang et al., 2018). Despite these advances, interpretability remains a perceived weakness of deep learning approaches (Reichstein et al., 2019), though some studies have attempted to understand the outputs of “black box” studies with a class of Artificial Intelligence (AI) neuroscience tools which vary the inputs of AI to see the changes of outputs (Voosen, 2017).…”
Section: Introduction
confidence: 99%
“…In recent years, owing to breakthroughs in algorithms and computational conditions, deep neural networks (DNNs, also referred to as deep learning methods) based on ANNs have received a great deal of attention (LeCun et al 2015). The long short-term memory neural network (LSTM) is one of the most common applications of DNNs (Kao et al 2020; Tennant et al 2020), continuously storing antecedent useful information via the memory cell structure. It has been widely used in various fields, including flood forecasting (Hu et al 2018; Le et al 2019; Ding et al 2020; Lin et al 2020; Tennant et al 2020), with multiple hidden layers and techniques such as minibatch training and the dropout method (LeCun et al 2015; Xiang et al 2020).…”
Section: Introduction
confidence: 99%
“…The single-output LSTM neural network has been found to produce satisfactory hydrographs and handle potential noise in the series (Nourani et al 2014; Zhang et al 2016; Hu et al 2018; Kratzert et al 2018).…”
Section: Introduction
confidence: 99%