2022
DOI: 10.48550/arxiv.2207.00521
Preprint
Using Machine Learning to Anticipate Tipping Points and Extrapolate to Post-Tipping Dynamics of Non-Stationary Dynamical Systems

Abstract: In this paper we consider the machine learning (ML) task of predicting tipping point transitions and long-term post-tipping-point behavior associated with the time evolution of an unknown (or partially unknown), non-stationary, potentially noisy and chaotic, dynamical system. We focus on the particularly challenging situation where the past dynamical state time series that is available for ML training predominantly lies in a restricted region of the state space, while the behavior to be predicted evolves on a …


Cited by 1 publication (3 citation statements)
References: 67 publications
“…In DS reconstruction, it is therefore important to check for geometrical and other time-invariant properties of the systems under study. For instance, measures based on the Kullback-Leibler divergence (Brenner et al, 2022; Koppe et al, 2019) or Wasserstein distance (Patel & Ott, 2022) have been used to assess the geometrical overlap in data point distributions in the large time limit across state spaces generated by the true and the reconstructed system. The maximal Lyapunov exponent or the so-called correlation dimension, an empirical estimate of the fractal dimensionality of an attractor, are other examples of such invariant measures (Kantz & Schreiber, 2004; see also Gilpin, 2020).…”
Section: Reconstructing Computational Dynamics From Time Series Data
confidence: 99%
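The invariant-measure check described in the statement above can be sketched in code: compare the binned state-space occupancy distributions of a true and a reconstructed trajectory in the large-time limit via a Kullback-Leibler divergence estimate. This is a minimal illustrative sketch, not the cited authors' implementation; the function name, the uniform binning scheme, and the `eps` smoothing constant are all assumptions made here for the example.

```python
import numpy as np

def state_space_kl(true_traj, recon_traj, bins=20, eps=1e-10):
    """Estimate the KL divergence between binned state-space occupancy
    distributions of a true and a reconstructed trajectory.

    Both trajectories are arrays of shape (T, d). A small eps is added
    to the normalized histograms to avoid log(0) in empty bins.
    (Illustrative sketch; binning and smoothing choices are assumptions.)
    """
    # Common bin edges spanning both trajectories, per dimension
    lo = np.minimum(true_traj.min(axis=0), recon_traj.min(axis=0))
    hi = np.maximum(true_traj.max(axis=0), recon_traj.max(axis=0))
    edges = [np.linspace(l, h, bins + 1) for l, h in zip(lo, hi)]

    # Occupancy histograms over the shared state-space partition
    p, _ = np.histogramdd(true_traj, bins=edges)
    q, _ = np.histogramdd(recon_traj, bins=edges)

    # Normalize to probability distributions and smooth empty bins
    p = p / p.sum() + eps
    q = q / q.sum() + eps
    return float(np.sum(p * np.log(p / q)))
```

A divergence near zero indicates that the reconstructed system visits the same regions of state space with the same long-run frequencies as the true system, which is the geometrical-overlap criterion the statement refers to; the Wasserstein distance mentioned alongside it plays an analogous role but also accounts for how far apart mismatched mass lies.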
“…However, as far as DS reconstruction is concerned, this topic has not yet been explored much. It remains unclear whether and how a DS can be retrieved from time series measurements that entail slow parameter drifts (Patel & Ott, 2022). The problem here is partly related to the exploding/vanishing gradient issue briefly raised in sect.…”
Section: Outlook and Future Challenges
confidence: 99%
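The non-stationary setting this statement describes, a time series whose generating system undergoes a slow parameter drift, can be illustrated with a toy example: a logistic map whose control parameter drifts linearly over the observation window. This is a hedged sketch for intuition only; the function name, parameter range, and linear drift schedule are assumptions of this example, not taken from the cited works.

```python
import numpy as np

def drifting_logistic(T=10000, r0=3.6, r1=3.9, x0=0.5):
    """Generate a time series from a logistic map x -> r*x*(1-x)
    whose parameter r drifts slowly and linearly from r0 to r1
    over T steps. A reconstruction method fit to such data must
    cope with the fact that no single stationary system produced it.
    (Toy example; all parameter values are illustrative.)
    """
    r = np.linspace(r0, r1, T)  # slow, deterministic parameter drift
    x = np.empty(T)
    x[0] = x0
    for t in range(1, T):
        x[t] = r[t - 1] * x[t - 1] * (1.0 - x[t - 1])
    return x
```

Fitting a stationary model to such a series conflates the drift with the dynamics, which is one concrete way the retrieval problem the statement raises becomes ill-posed.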