2022
DOI: 10.2139/ssrn.4229792

Deep Convolutional Architectures for Extrapolative Forecasts in Time-Dependent Flow Problems

Cited by 3 publications (2 citation statements)
References 27 publications
“…12b. The downstream part of the dam is considered as dry whereas the free surface of the upstream part is considered as a random input parameter whose values are uniformly generated within its plausible variability range η_up ∈ U[29, 32] m. The snapshot matrix is obtained by running the numerical solver for each value of the upstream free surface, selected randomly from the generated sample set, for the whole N_t = 100 simulation time steps that constitute the temporal domain (t ∈ [0, 50] s). For each parameter-time combination, a so-called high-fidelity solution is stored in a vector of dimension N_x = 10,200, representing the free surface values at each node of the computational domain.…”
Section: Application To a Hypothetical Dam-break In A River
confidence: 99%
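
As a concrete illustration of the snapshot-matrix construction described in this citation statement, the sketch below assembles the matrix column by column from parameter-time samples. The solver call `run_shallow_water_solver` and the number of parameter samples are hypothetical placeholders, not details taken from the cited work.

```python
# Minimal sketch of assembling the snapshot matrix described above.
import numpy as np

rng = np.random.default_rng(seed=0)

n_params = 20          # number of sampled upstream free-surface values (assumed)
N_t = 100              # simulation time steps over t in [0, 50] s
N_x = 10_200           # nodes in the computational domain

def run_shallow_water_solver(eta_up, n_steps):
    # Hypothetical placeholder for the high-fidelity solver: the real code would
    # integrate the governing equations; a dummy field is returned here only so
    # that the sketch executes. Shape: (n_steps, N_x).
    return np.full((n_steps, N_x), eta_up)

# Upstream free surface sampled uniformly in its plausible range, eta_up ~ U[29, 32] m.
eta_samples = rng.uniform(29.0, 32.0, size=n_params)

# Each parameter-time combination yields one high-fidelity solution vector of size N_x,
# so the snapshot matrix has shape (N_x, n_params * N_t).
snapshots = np.empty((N_x, n_params * N_t))
for i, eta_up in enumerate(eta_samples):
    solution = run_shallow_water_solver(eta_up, N_t)        # (N_t, N_x)
    snapshots[:, i * N_t:(i + 1) * N_t] = solution.T
```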
“…The key idea behind the popular recurrent network, long short-term memory (LSTM) [29], is the so-called cell state, which helps to overcome the vanishing/exploding gradient problems caused by long-term dependencies within the network. Some configurations combine CAEs with LSTMs or TCNs [19, 30-33] to provide a tool for time predictions over the latent space. This paper proposes a non-intrusive reduced order modeling (NIROM) framework for parametrized and time-dependent flow problems.…”
Section: Introduction
confidence: 99%
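
The CAE-plus-LSTM configuration referred to in this statement can be sketched as follows: a convolutional autoencoder compresses each snapshot to a low-dimensional latent vector, and an LSTM advances sequences of latent vectors in time. The layer sizes, latent dimension, and input length of 256 are illustrative assumptions, not the settings of the cited paper.

```python
import torch
import torch.nn as nn

class CAE(nn.Module):
    """Convolutional autoencoder: snapshot (1, 256) <-> latent vector (latent_dim)."""
    def __init__(self, latent_dim=8):
        super().__init__()
        # Encoder: 1D convolutions downsample the field, then a linear map to the latent space.
        self.encoder = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Flatten(),
            nn.Linear(32 * 64, latent_dim),
        )
        # Decoder mirrors the encoder to reconstruct the full field from the latent vector.
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 32 * 64), nn.Unflatten(1, (32, 64)),
            nn.ConvTranspose1d(32, 16, kernel_size=4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose1d(16, 1, kernel_size=4, stride=2, padding=1),
        )

    def forward(self, x):
        z = self.encoder(x)
        return self.decoder(z), z

class LatentLSTM(nn.Module):
    """Predicts the next latent state from a window of previous latent states."""
    def __init__(self, latent_dim=8, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(latent_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, latent_dim)

    def forward(self, z_seq):                 # z_seq: (batch, window, latent_dim)
        out, _ = self.lstm(z_seq)
        return self.head(out[:, -1])          # next latent state: (batch, latent_dim)

# Usage sketch: encode a sequence of snapshots, then forecast the next latent state.
cae, lstm = CAE(), LatentLSTM()
snapshots = torch.randn(100, 1, 256)          # (time steps, channels, spatial points) - dummy data
_, z = cae(snapshots)                         # latent trajectory, shape (100, latent_dim)
z_next = lstm(z[:-1].unsqueeze(0))            # predicted latent state at the final step
```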