2020
DOI: 10.1063/5.0023764
Predicting critical transitions in multiscale dynamical systems using reservoir computing

Abstract: We study the problem of predicting rare critical transition events for a class of slow–fast nonlinear dynamical systems. The state of the system of interest is described by a slow process, whereas a faster process drives its evolution and induces critical transitions. By taking advantage of recent advances in reservoir computing, we present a data-driven method to predict the future evolution of the state. We show that our method is capable of predicting a critical transition event at least several numerical t…

Cited by 19 publications (15 citation statements)
References 60 publications
“…In this limit, it is unnecessary to vary s, N, and σ_A² independently when selecting reservoir parameters, as is often done in the literature; see, for example, [5,13]. For a given input series, the reservoir dynamics thus depends on only two parameters, namely sNσ_A² and nσ_in². In the remainder of the article, these two parameters are used to investigate parameter regions where reservoir computing is successful.…”
Section: Parameters
mentioning, confidence: 99%
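As a sketch of how these combined parameters enter the reservoir dynamics, the snippet below builds a standard echo-state update with a sparse Gaussian reservoir matrix. The parameter names (s = connection density, N = reservoir size, σ_A = recurrent weight scale, n = input dimension, σ_in = input weight scale) follow the quoted passage, but the concrete update rule, values, and toy input are illustrative assumptions, not the cited paper's implementation.

```python
import numpy as np

# Minimal echo-state reservoir sketch (assumed setup, not the paper's code).
rng = np.random.default_rng(0)

N, n = 500, 1                 # reservoir size, input dimension
s = 0.05                      # connection density (fraction of nonzero weights)
sigma_A, sigma_in = 0.1, 0.5  # recurrent and input weight scales

# Sparse reservoir matrix: each entry is nonzero with probability s,
# drawn from N(0, sigma_A^2).
mask = rng.random((N, N)) < s
A = mask * rng.normal(0.0, sigma_A, (N, N))

# Input weights drawn from N(0, sigma_in^2).
W_in = rng.normal(0.0, sigma_in, (N, n))

def step(r, u):
    """One reservoir update: r(t+1) = tanh(A r(t) + W_in u(t))."""
    return np.tanh(A @ r + W_in @ u)

r = np.zeros(N)
for t in range(100):
    u = np.array([np.sin(0.1 * t)])  # toy input series
    r = step(r, u)

# In the large-N limit, each component of A @ r is approximately Gaussian
# with variance s*N*sigma_A^2 * var(r): the product s*N*sigma_A^2 acts as a
# single effective recurrent gain, which is why s, N, and sigma_A^2 need not
# be varied independently.
print(s * N * sigma_A**2)  # effective recurrent gain (here ~0.25)
```

Since only the products sNσ_A² and nσ_in² matter in this limit, a parameter scan can be performed over these two quantities directly rather than over all five raw parameters.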
“…This is the same result as [7], under relaxed assumptions on the input time series. To obtain ⟨D_ii²⟩, we use the same procedure as [7] and construct an iterative map for the variance of the reservoir states r_i(t). Assuming that N is large enough that the sum Σ_{j=1}^N A_ij r_j is normally distributed, we can compute the probability density function f_b(x) of the local field as the convolution of the density of a normal distribution with zero mean and variance sNσ_A²σ_r², and the empirical probability mass function of the normalized input time series (given an ensemble of input trajectories initialized with random initial values), scaled by σ_in². This yields an iterative map for the variance of r_i(t), taken over input samples and ensembles of A and W^(in),…”
Section: Maximal Lyapunov Exponent
mentioning, confidence: 99%
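The iterative variance map described above can be sketched numerically: assuming the local field of each neuron is a zero-mean Gaussian recurrent part (variance sNσ_A²·σ_r²) plus an independent scaled input sample, the state variance updates as σ_r²(t+1) = E[tanh(b)²]. The Monte Carlo evaluation, parameter values, and the stand-in Gaussian input series below are all assumptions for illustration.

```python
import numpy as np

# Hedged sketch of an iterative map for the reservoir-state variance,
# under a Gaussian local-field assumption (illustrative, not the paper's code).
rng = np.random.default_rng(1)

g2 = 0.25        # effective recurrent gain s*N*sigma_A^2 (assumed value)
sigma_in2 = 0.1  # effective input variance n*sigma_in^2 (assumed value)

# Stand-in for the normalized input time series.
inputs = rng.standard_normal(10_000)

def variance_map(sigma_r2, samples=10_000):
    """One step of the map: sigma_r^2 -> E[tanh(b)^2], b = recurrent + input."""
    recurrent = rng.normal(0.0, np.sqrt(g2 * sigma_r2), samples)
    u = rng.choice(inputs, samples) * np.sqrt(sigma_in2)
    return np.mean(np.tanh(recurrent + u) ** 2)

# Iterate to (approximately) the fixed point sigma_r^2*.
sigma_r2 = 1.0
for _ in range(50):
    sigma_r2 = variance_map(sigma_r2)
print(sigma_r2)  # self-consistent stationary variance of r_i(t)
```

The fixed point of this map is the self-consistent stationary variance that feeds into the Lyapunov-exponent estimate discussed in this section.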
“…Reservoir computing has recently become a popular object of study in this context, as it yields simple models of such dynamical systems. By exploiting signal-driven synchronization, in which the dynamics of the reservoir neurons synchronizes with the input time series, a reservoir computer can be trained to reproduce a time series autonomously [1,2,3,4]. A necessary condition for this synchronization is that the dynamics of the reservoir neurons be contractive, a property ensured by the reservoir dynamics having a negative maximal Lyapunov exponent.…”
Section: Introduction
mentioning, confidence: 99%
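The contraction condition mentioned above can be checked numerically: two reservoir trajectories driven by the same input but started from different states should converge when the input-conditional maximal Lyapunov exponent is negative. The reservoir construction and parameter values below are assumptions chosen to make the dynamics contractive, not the cited papers' settings.

```python
import numpy as np

# Hedged sketch: numerical check of signal-driven synchronization
# (negative conditional Lyapunov exponent) for an assumed reservoir.
rng = np.random.default_rng(2)

N = 300
# Spectral radius ~0.9 < 1 for this scaling -> contractive dynamics.
A = rng.normal(0.0, 0.9 / np.sqrt(N), (N, N))
W_in = rng.normal(0.0, 0.5, (N, 1))

def step(r, u):
    return np.tanh(A @ r + W_in @ u)

r1 = rng.standard_normal(N)
r2 = rng.standard_normal(N)

seps = []
for t in range(60):
    u = np.array([np.sin(0.2 * t)])  # common driving signal
    r1, r2 = step(r1, u), step(r2, u)
    seps.append(np.linalg.norm(r1 - r2))

# Average exponential separation rate ~ conditional maximal Lyapunov exponent.
lam = np.log(seps[-1] / seps[0]) / len(seps)
print(lam)  # negative: trajectories synchronize to the input
```

A negative estimate indicates that the reservoir state becomes a function of the input history alone (the echo-state property), which is what allows the trained reservoir to reproduce the time series autonomously.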
“…In particular, after the seminal work of Pathak et al. 4,5 , in which reservoir computing techniques were used to reconstruct 4 and predict 5 complex spatiotemporal dynamics, many important ML-based applications in nonlinear dynamics have been developed. For example, ML has been successfully used for the prediction of extreme events 6 , critical transitions 7 , and regime changes 8 , approximation of the Koopman operator 9 , distinguishing regular from chaotic behavior 10 , and identification of chimera states 11 , to mention just a few. One of the main reasons for the impressive number of such contributions is that the ML framework allows for model-free analysis, without the need to know the underlying model responsible for generating the observed data.…”
Section: Introduction
mentioning, confidence: 99%