Deep transfer learning based on transformer for flood forecasting in data-sparse basins
2023 | DOI: 10.1016/j.jhydrol.2023.129956

Cited by 38 publications (9 citation statements) | References 61 publications
“…In various domains, there is work leveraging deep learning to assimilate data from monitoring points and forecast critical parameters. Xu et al. [15] presented a transfer learning framework based on the Transformer (TL-Transformer) for accurate flood forecasting in data-sparse basins. El-Shafeiy et al. [16] introduced a pioneering technique, multivariate multiple convolutional networks with long short-term memory (MCN-LSTM), and applied it to real-time water quality monitoring.…”
Section: Related Work | mentioning | confidence: 99%
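The transfer strategy named in this statement, pretraining a Transformer on a data-rich source basin and then fine-tuning it on the data-sparse target, can be sketched as below. This is a minimal illustration, not the authors' implementation: the architecture sizes, the synthetic data loaders, and the choice to freeze the encoder during fine-tuning are all assumptions.

```python
# Minimal sketch of Transformer-based transfer learning for flood
# forecasting, assuming PyTorch. Pretrain on a data-rich source basin,
# then fine-tune on the data-sparse target basin. All hyperparameters,
# the synthetic loaders, and the frozen-encoder scheme are assumptions,
# not details taken from Xu et al. (2023).
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

class FloodTransformer(nn.Module):
    """Sequence-to-one regressor: past forcings -> next-step discharge."""
    def __init__(self, n_features, d_model=64, n_heads=4, n_layers=2):
        super().__init__()
        self.embed = nn.Linear(n_features, d_model)   # feature projection
        layer = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=128, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, 1)             # discharge at t+1
        # Positional encoding is omitted here for brevity.

    def forward(self, x):                             # x: (batch, time, feat)
        h = self.encoder(self.embed(x))
        return self.head(h[:, -1])                    # read the last step

def make_loader(n_samples, seq_len=24, n_features=5):
    # Synthetic stand-in for basin data (rainfall, stage, etc. sequences).
    x = torch.randn(n_samples, seq_len, n_features)
    y = torch.randn(n_samples, 1)
    return DataLoader(TensorDataset(x, y), batch_size=32, shuffle=True)

def train(model, loader, epochs, lr):
    opt = torch.optim.Adam(
        (p for p in model.parameters() if p.requires_grad), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()

model = FloodTransformer(n_features=5)
# 1) Pretrain on the data-rich source basin.
train(model, make_loader(2000), epochs=5, lr=1e-3)
# 2) Freeze the encoder and fine-tune the remaining layers on the small
#    target-basin sample (one plausible transfer scheme among several).
for p in model.encoder.parameters():
    p.requires_grad = False
train(model, make_loader(100), epochs=5, lr=1e-4)
```

Freezing the pretrained encoder is only one plausible transfer scheme; fine-tuning all weights at a lower learning rate is an equally common alternative.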
“…However, in the context of flood forecasting, only a handful of studies involve the Transformer architecture. Further, some of these studies show that the relative accuracy of models, including Transformers, differs across datasets (Wei et al., 2023; Xu et al., 2023).…”
Section: Introduction | mentioning | confidence: 99%
“…In the realm of flood forecasting, there is a scarcity of studies that incorporate the Transformer architecture, representing a notable gap in the literature. Moreover, existing research indicates that the relative accuracy of models, including Transformers, varies across datasets (Wei et al., 2023; Xu et al., 2023).…”
Section: Introduction | mentioning | confidence: 99%
“…LSTM and the Transformer encoder show similar accuracies in daily water level forecasting and in predicting extremes. Although Transformer models tend to show better performance in streamflow forecasting, as in the studies of Castangia et al. (2023), Liu et al. (2022), and Xu et al. (2023), a lack of sufficient data may reduce the performance of the Transformer encoder model (Wei et al., 2023). In this case, switching the inputs between the encoder and decoder gives similar performance, but it might differ in another case. Therefore, it is recommended to apply this strategy and check the accuracy when there are input features with two different time steps.…”
mentioning | confidence: 99%
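The input-switching strategy recommended in this statement, trying feature groups with two different time steps on either side of an encoder-decoder Transformer and comparing accuracy, can be illustrated with the sketch below. The feature groupings (hourly rainfall versus daily water level), tensor shapes, and model sizes are assumptions made for illustration only.

```python
# Minimal sketch of the input-switching strategy, assuming PyTorch: feed
# feature groups sampled at two different time steps to the encoder and
# decoder of a Transformer, then swap them and compare accuracy. Feature
# groupings, shapes, and model sizes are illustrative assumptions.
import torch
import torch.nn as nn

class TwoStreamTransformer(nn.Module):
    def __init__(self, enc_features, dec_features, d_model=64):
        super().__init__()
        self.enc_in = nn.Linear(enc_features, d_model)
        self.dec_in = nn.Linear(dec_features, d_model)
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=4, num_encoder_layers=2,
            num_decoder_layers=2, batch_first=True)
        self.head = nn.Linear(d_model, 1)

    def forward(self, src, tgt):
        out = self.transformer(self.enc_in(src), self.dec_in(tgt))
        return self.head(out[:, -1])   # forecast from the last decoder step

hourly = torch.randn(8, 72, 3)  # e.g. 72 hourly steps, 3 rainfall features
daily = torch.randn(8, 30, 2)   # e.g. 30 daily steps, 2 water-level features

# Configuration A: hourly features drive the encoder, daily the decoder.
pred_a = TwoStreamTransformer(enc_features=3, dec_features=2)(hourly, daily)

# Configuration B: inputs switched. Per the quoted recommendation, train
# both configurations and keep whichever validates better.
pred_b = TwoStreamTransformer(enc_features=2, dec_features=3)(daily, hourly)
```

Because the quoted study found the two configurations roughly equivalent on its data, the practical takeaway is to treat the encoder/decoder assignment as a tunable choice and validate both.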