2020
DOI: 10.5194/npg-27-373-2020
Data-driven predictions of a multiscale Lorenz 96 chaotic system using machine-learning methods: reservoir computing, artificial neural network, and long short-term memory network

Abstract: In this paper, the performance of three machine-learning methods for predicting short-term evolution and for reproducing the long-term statistics of a multiscale spatiotemporal Lorenz 96 system is examined. The methods are an echo state network (ESN, which is a type of reservoir computing; hereafter RC–ESN), a deep feed-forward artificial neural network (ANN), and a recurrent neural network (RNN) with long short-term memory (LSTM; hereafter RNN–LSTM). This Lorenz 96 system has three tiers of …
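As a rough illustration of the reservoir-computing approach named in the abstract, here is a minimal echo state network sketch in NumPy: only the linear readout is trained, by ridge regression, while the recurrent reservoir weights stay fixed. The reservoir size, spectral radius, ridge penalty, and the toy sine-wave task are illustrative assumptions, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative hyperparameters (assumptions, not taken from the paper)
n_in, n_res = 1, 300
rho, ridge = 0.9, 1e-6

# Random input and reservoir weights; rescale W to spectral radius rho
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.normal(size=(n_res, n_res))
W *= rho / np.max(np.abs(np.linalg.eigvals(W)))

def run_reservoir(u):
    """Drive the reservoir with input sequence u and collect its states."""
    h = np.zeros(n_res)
    states = []
    for x in u:
        h = np.tanh(W_in @ np.atleast_1d(x) + W @ h)
        states.append(h.copy())
    return np.array(states)

# Toy task: one-step-ahead prediction of a sine wave
t = np.linspace(0, 20 * np.pi, 2000)
u, y = np.sin(t[:-1]), np.sin(t[1:])

H = run_reservoir(u)[200:]   # discard a washout transient
Y = y[200:]

# Ridge regression for the readout (the only trained component)
W_out = np.linalg.solve(H.T @ H + ridge * np.eye(n_res), H.T @ Y)
pred = H @ W_out
print("RMSE:", np.sqrt(np.mean((pred - Y) ** 2)))
```

The key design point, which is what makes ESNs cheap to train, is that learning reduces to a single linear least-squares solve over the collected reservoir states.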

Cited by 161 publications (99 citation statements)
References 74 publications
“…ML methods (in this case, recurrent neural networks) have also shown themselves capable of reproducing a time series from canonical chaotic systems with predictability beyond what dynamical systems theory would suggest, e.g. [ 55 ] (which indeed explicitly makes a claim to be ‘model-free’), [ 56 ]. Does this mean we have come full circle on the von Neumann revolution, and return to forecasting from pattern recognition rather than physics?…”
Section: Learning Physics From Data
confidence: 99%
“…RNNs are networks in which, unlike ANNs, information moves forward and backward between the input, deeper layers, and output. RNNs account for memory between sequential data points and are often used for time series forecasting (Chattopadhyay, Hassanzadeh, & Subramanian, 2020; Goodfellow et al, 2016). We use a gated recurrent unit (GRU) as the RNN (see Appendix B for details).…”
Section: The Numerical and Data‐driven (Super‐)parameterized Models
confidence: 99%
“…Although the focus of this paper is on subgrid‐scale modeling, we have also investigated the performance of a fully DD model. In this model, following Chattopadhyay, Hassanzadeh, & Subramanian (2020), an RNN is trained on X , which is then used to predict the spatiotemporal evolution of X from an initial condition. The RNN we use here is a GRU (see Appendix C), trained on 10 6 sequential values of X ( t ) sampled at every 10Δ t from the DNS data to mimic the time step used in all other low‐resolution models.…”
Section: The Numerical and Data‐driven (Super‐)parameterized Models
confidence: 99%
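The gated recurrent unit mentioned in the citation statements above can be sketched as a single cell update. The gate convention below follows the standard GRU formulation (update gate z, reset gate r, candidate state h̃); the weight shapes and random toy sequence are assumptions for illustration, not the cited papers' configuration.

```python
import numpy as np

def gru_cell(x, h, params):
    """One GRU step: update gate z, reset gate r, candidate state h_tilde."""
    Wz, Uz, bz, Wr, Ur, br, Wh, Uh, bh = params
    sig = lambda a: 1.0 / (1.0 + np.exp(-a))
    z = sig(Wz @ x + Uz @ h + bz)         # update gate
    r = sig(Wr @ x + Ur @ h + br)         # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h) + bh)
    return (1 - z) * h + z * h_tilde      # blend old state and candidate

rng = np.random.default_rng(1)
n_in, n_h = 4, 8
# Nine parameter arrays in the order Wz, Uz, bz, Wr, Ur, br, Wh, Uh, bh
params = [rng.normal(scale=0.1, size=s)
          for s in [(n_h, n_in), (n_h, n_h), (n_h,)] * 3]

# Run the cell over a short random sequence, carrying the hidden state
h = np.zeros(n_h)
for x in rng.normal(size=(10, n_in)):
    h = gru_cell(x, h, params)
print(h.shape)
```

The gating is what gives RNNs of this family their memory between sequential data points: z interpolates between keeping the old hidden state and overwriting it with the candidate.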
“…The L96 model has rapidly gained relevance among geoscientists, physicists, and applied mathematicians, as it has become a benchmark testbed for parametrisations [40][41][42][43][44][45][46], for studying extreme events [47][48][49][50], for developing data assimilation schemes [51][52][53][54], for developing ensemble forecasting techniques [55][56][57], for studying the properties of Lyapunov exponents and covariant Lyapunov vectors [58][59][60][61], for developing and testing ideas in nonequilibrium statistical mechanics [62][63][64][65][66], and for investigating bifurcations [67][68][69][70][71][72][73]. By looking at these references, the reader can find a very thorough analysis of the properties of the L96 model.…”
Section: The Lorenz '96 Model
confidence: 99%
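For reference, the single-tier Lorenz '96 equations dX_k/dt = (X_{k+1} − X_{k−2}) X_{k−1} − X_k + F, with periodic index k, can be integrated with a simple RK4 step. K = 36 sites and forcing F = 8 are the customary chaotic settings, used here as illustrative assumptions rather than the exact configuration of any cited study.

```python
import numpy as np

K, F = 36, 8.0  # customary single-tier Lorenz '96 settings (assumed here)

def l96_rhs(x):
    """dX_k/dt = (X_{k+1} - X_{k-2}) * X_{k-1} - X_k + F, periodic in k."""
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

def rk4_step(x, dt):
    """One fourth-order Runge-Kutta step of size dt."""
    k1 = l96_rhs(x)
    k2 = l96_rhs(x + 0.5 * dt * k1)
    k3 = l96_rhs(x + 0.5 * dt * k2)
    k4 = l96_rhs(x + dt * k3)
    return x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

# Integrate past the transient from a slightly perturbed fixed point
x = F * np.ones(K)
x[0] += 0.01
for _ in range(5000):
    x = rk4_step(x, 0.01)
print(x)  # a snapshot on the chaotic attractor
```

Trajectories like this one are what the benchmark studies cited above use as training and verification data for parametrisations, data assimilation, and forecasting experiments.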