2020
DOI: 10.48550/arxiv.2004.10928
Preprint

Recurrent Neural Network Architecture Search for Geophysical Emulation

Cited by 9 publications (11 citation statements)
References 0 publications
“…The three gates are called the forget gate f_g, the input gate i_g, and the output gate o_g; they control the information flow by erasing, writing, and reading, respectively. LSTM models therefore memorize information over different intervals and are well suited to predicting time series with a given duration interval [30,31].…”
Section: Long Short-Term Memory (mentioning)
confidence: 99%
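For context (this is not part of the quoted statement), the gate structure described above corresponds to the standard LSTM cell equations, sketched here in the usual notation with input x_t, hidden state h_t, cell state c_t, sigmoid activation \sigma, elementwise product \odot, and generic weight matrices W, U and biases b (these symbol names are conventional, not taken from the quoted source):

\[
\begin{aligned}
f_t &= \sigma(W_f x_t + U_f h_{t-1} + b_f) \\
i_t &= \sigma(W_i x_t + U_i h_{t-1} + b_i) \\
o_t &= \sigma(W_o x_t + U_o h_{t-1} + b_o) \\
c_t &= f_t \odot c_{t-1} + i_t \odot \tanh(W_c x_t + U_c h_{t-1} + b_c) \\
h_t &= o_t \odot \tanh(c_t)
\end{aligned}
\]

Here the forget gate f_t erases parts of the previous cell state, the input gate i_t writes new candidate information into it, and the output gate o_t reads the cell state out into the hidden state, matching the erase/write/read roles mentioned in the citation statement.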
“…This is because solving PDE forward models for such systems may require very fine spatiotemporal numerical discretizations, which cause a significant bottleneck in design and forecast tasks [44]. The prospect of bypassing traditional numerical methods and building surrogates from data alone [26,36,37,41] is attractive for multiple applications ranging from engineering design [4,38] and control [35,23] to climate modeling [8,9,28]. This is because data-driven ROMs allow for rapid predictions of nonlinear dynamics unencumbered by the stability-based limitations of numerical discretizations.…”
Section: Introduction (mentioning)
confidence: 99%
“…[33] introduced a multi-agent reinforcement learning framework as an automated discovery tool for turbulence models and applied it to forced homogeneous isotropic turbulence. Besides turbulence closure modeling, deep learning has proven very successful for challenging problems such as super-resolution of turbulent flows [34][35][36], data-driven modeling of chaotic systems [37][38][39], reduced-order modeling of high-dimensional multiphysics systems [40][41][42][43], and developing forecast models for complex physical systems [44][45][46][47].…”
Section: Introduction (mentioning)
confidence: 99%