Deep insight into daily runoff forecasting based on a CNN-LSTM model
2022 | DOI: 10.1007/s11069-022-05363-2

Cited by 40 publications (7 citation statements)
References 40 publications
“…This superiority underscores the capacity of deep learning to extract intricate patterns and features from complex data, enabling more accurate anomaly detection in network traffic. LSTM demonstrates strong performance in capturing temporal dependencies within the data, thereby boosting detection accuracy [16]. By incorporating recurrent connections, LSTM effectively learns and remembers long-range dependencies in sequential data, which is particularly beneficial for detecting anomalous patterns evolving over time in network traffic.…”
Section: Discussion (mentioning, confidence: 99%)
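The gating mechanism behind the long-range memory described in the excerpt above can be sketched in a few lines. This is an illustrative single-cell LSTM step in NumPy (the dimensions, weights, and input sequence are arbitrary assumptions, not taken from the cited work); the point is that the cell state `c` is updated additively through gates, which is what lets information persist across many time steps.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b):
    """One LSTM step: input (i), forget (f), and output (o) gates
    control what the cell state c keeps, so long-range information
    can survive many steps without vanishing."""
    H = h.size
    z = W @ x + U @ h + b                    # stacked pre-activations, shape (4*H,)
    i = sigmoid(z[0:H])                      # input gate
    f = sigmoid(z[H:2*H])                    # forget gate
    o = sigmoid(z[2*H:3*H])                  # output gate
    g = np.tanh(z[3*H:4*H])                  # candidate cell update
    c = f * c + i * g                        # additive update -> stable memory
    h = o * np.tanh(c)                       # hidden state exposed downstream
    return h, c

# Unroll over a random 50-step sequence (illustrative dimensions).
rng = np.random.default_rng(0)
D, H = 3, 4                                  # input size, hidden size
W = rng.normal(scale=0.1, size=(4 * H, D))
U = rng.normal(scale=0.1, size=(4 * H, H))
b = np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
for t in range(50):
    h, c = lstm_step(rng.normal(size=D), h, c, W, U, b)
print(h.shape)  # (4,)
```

Because `c` is carried forward by elementwise gating rather than repeated matrix multiplication, gradients through it decay far more slowly than in a plain RNN — the property the excerpt credits for detecting patterns that evolve over time.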
“…For Model 1, Model 4, and Model 7, the numbers of samples required for building the models were 42, 155, and 37, respectively. The difference in model performance between the training and testing datasets can be considered an indicator of whether the model has been overfitted, a common phenomenon when using machine learning models [44,45]. The results show that Model 4 performed more consistently between the training and testing datasets than Model 1 and Model 7, implying that a larger number of samples leads to a lower degree of overfitting.…”
Section: Direction To Future Studies For Applying Machine Learning To... (mentioning, confidence: 96%)
“…Recognizing their complementary modeling capacities over hierarchical local feature extraction (CNN) and selective memorization of longer temporal patterns (LSTM), integrated CNN-LSTM architectures have shown great promise for financial time series analysis [32], [33]. Typically, contracted CNN representations of local variable-wise patterns feed into subsequent LSTMs to assimilate both short- and long-range historical contexts.…”
Section: Deep Neural Network (mentioning, confidence: 99%)
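The CNN-into-LSTM pipeline the excerpt describes — convolutional layers compressing local patterns, an LSTM consuming the resulting sequence — can be sketched in PyTorch. This is a minimal illustrative architecture, not the model of the cited paper: the channel counts, kernel width, hidden size, and one-step-ahead regression head are all assumptions.

```python
import torch
import torch.nn as nn

class CNNLSTM(nn.Module):
    """Illustrative hybrid: Conv1d extracts local multivariate patterns,
    LSTM integrates them over the full history, linear head forecasts
    one step ahead."""
    def __init__(self, n_features=5, conv_channels=16, hidden=32):
        super().__init__()
        self.conv = nn.Conv1d(n_features, conv_channels,
                              kernel_size=3, padding=1)
        self.lstm = nn.LSTM(conv_channels, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                 # x: (batch, seq_len, n_features)
        z = torch.relu(self.conv(x.transpose(1, 2)))  # (batch, ch, seq_len)
        out, _ = self.lstm(z.transpose(1, 2))         # (batch, seq_len, hidden)
        return self.head(out[:, -1])                  # forecast from last step

# Dummy batch: 8 series, 30 time steps, 5 input variables (all assumed shapes).
model = CNNLSTM()
y = model(torch.randn(8, 30, 5))
print(y.shape)  # torch.Size([8, 1])
```

The design choice to place the convolution first means the LSTM sees a shorter, denser feature sequence rather than raw inputs, which is the division of labor the excerpt attributes to these hybrids.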
“…Nelson et al. [37] proposed a character-level language model with event-based trading. The studies in [32], [33] evaluated combinations of CNNs and LSTMs, showing improvements over the individual models. However, these deep neural models often face challenges with longer-term dependencies in sequential data due to vanishing-gradient issues.…”
Section: Deep Neural Network (mentioning, confidence: 99%)