2021
DOI: 10.48550/arxiv.2109.09703
Preprint

Learning to Forecast Dynamical Systems from Streaming Data

Abstract: Kernel analog forecasting (KAF) is a powerful methodology for data-driven, non-parametric forecasting of dynamically generated time series data. This approach has a rigorous foundation in Koopman operator theory and it produces good forecasts in practice, but it suffers from the heavy computational costs common to kernel methods. This paper proposes a streaming algorithm for KAF that only requires a single pass over the training data. This algorithm dramatically reduces the costs of training and prediction wit…
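The abstract describes KAF as kernel regression against time-shifted observables, with the paper's contribution being a single-pass streaming variant. Below is a minimal, illustrative sketch of a batch KAF-style forecast together with a generic single-pass approximation using random Fourier features. The names (`kaf_batch_fit`, `StreamingRFFForecaster`) and all parameter choices are assumptions for illustration, not the paper's actual algorithm or API.

```python
import numpy as np

# ----- batch KAF-style forecast: dense Gram matrix, quadratic memory -----

def gaussian_kernel(A, B, bandwidth):
    """Pairwise Gaussian kernel matrix between rows of A and rows of B."""
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return np.exp(-d2 / (2.0 * bandwidth**2))

def kaf_batch_fit(X, y_lead, bandwidth=0.5, ridge=1e-3):
    """Kernel ridge weights mapping current states X to the observable y_lead, tau steps ahead."""
    K = gaussian_kernel(X, X, bandwidth)
    return np.linalg.solve(K + ridge * len(X) * np.eye(len(X)), y_lead)

def kaf_batch_predict(X_train, alpha, X_new, bandwidth=0.5):
    """Forecast the observable at the trained lead time for new states X_new."""
    return gaussian_kernel(X_new, X_train, bandwidth) @ alpha

# ----- single-pass variant via random Fourier features (illustrative only) -----

class StreamingRFFForecaster:
    """Accumulates the sufficient statistics z z^T and z*y in one pass over the stream."""

    def __init__(self, dim, n_features=256, bandwidth=0.5, ridge=1e-3, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(scale=1.0 / bandwidth, size=(dim, n_features))
        self.b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
        self.A = ridge * np.eye(n_features)   # regularized feature second-moment matrix
        self.r = np.zeros(n_features)         # feature/target cross-moment
        self.scale = np.sqrt(2.0 / n_features)

    def _features(self, x):
        return self.scale * np.cos(x @ self.W + self.b)

    def update(self, x, y_lead):
        """Process one (state, future observable) pair; memory is O(n_features^2), single pass."""
        z = self._features(x)
        self.A += np.outer(z, z)
        self.r += y_lead * z

    def predict(self, x_new):
        theta = np.linalg.solve(self.A, self.r)
        return self._features(x_new) @ theta

# toy usage on a noisy sine wave, forecasting tau steps ahead from 2-D delay coordinates
t = np.linspace(0, 20, 500)
s = np.sin(t) + 0.05 * np.random.default_rng(1).normal(size=t.size)
tau = 5
X = np.stack([s[:-tau - 1], s[1:-tau]], axis=1)   # delay-embedded states
y = s[tau + 1:]                                   # observable tau steps ahead

model = StreamingRFFForecaster(dim=2)
for xi, yi in zip(X, y):
    model.update(xi, yi)
print(model.predict(X[-1]))
```

The batch fit requires storing and solving an N x N Gram system, which is the cost the abstract refers to; the streaming sketch trades that for a fixed-size feature covariance updated one sample at a time.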

Cited by 4 publications (4 citation statements)
References 60 publications (91 reference statements)

“…An interesting avenue of research would be to study how to optimally recompute the dimensionality reduction step so that it requires the least amount of computational effort. Possible sources of ideas could come from landmarking as in [54] or [55], as well as from streaming PCA algorithms [56], or from [57]. The potential advantage of calculating diffusion map coordinates and working in the reduced space is that (a) the evaluation of the surrogate is cheaper in the reduced space, but also (b) the number of coordinates in which this surrogate is optimized is lower.…”
Section: Discussion and Outlook
confidence: 99%
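The statement above points to streaming PCA as one way to refresh the dimensionality reduction cheaply as new data arrive. A minimal sketch of an Oja-style streaming subspace update is given below; the function `oja_streaming_pca` and its parameters are illustrative assumptions, not the algorithm of the cited reference [56].

```python
import numpy as np

def oja_streaming_pca(stream, k, lr=0.01, seed=0):
    """Oja-style streaming update of the top-k principal subspace.

    `stream` yields data vectors one at a time; only the k basis vectors are kept
    in memory, so the subspace can be refreshed as samples arrive.
    """
    rng = np.random.default_rng(seed)
    Q = None
    for x in stream:
        if Q is None:
            # initialize a random orthonormal basis with the dimension of the first sample
            Q, _ = np.linalg.qr(rng.normal(size=(x.size, k)))
        y = Q.T @ x                         # project the new sample onto the current basis
        Q += lr * np.outer(x - Q @ y, y)    # Oja subspace (Hebbian) update
        Q, _ = np.linalg.qr(Q)              # re-orthonormalize to keep a valid basis
    return Q

# toy check: recover a 2-D subspace from streaming 10-D samples
rng = np.random.default_rng(1)
basis = rng.normal(size=(10, 2))
samples = (basis @ rng.normal(size=(2, 2000))).T
Q = oja_streaming_pca(iter(samples), k=2)
```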
“…However, a number of recent ideas for scaling kernel methods can be applied, see e.g. [17,33] and references therein. Perhaps more importantly, specific to the context of dynamical systems is the fact that an approximate mode decomposition needs to be further computed, which requires the spectral decomposition of G. As showed in App.…”
Section: Principal Component Regression (PCR), a Standard Strategy to ...
confidence: 99%
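The statement above notes that, for dynamical systems, the Gram matrix G must additionally be spectrally decomposed to obtain an approximate mode decomposition, and that decomposition is where the cubic cost enters. The sketch below shows that step in a generic kernel/diffusion-maps style: build a normalized Gaussian Gram matrix from snapshots and take its leading eigenpairs. The function `gram_eigenfunctions` and its parameters are assumptions for illustration, not the construction of the citing paper.

```python
import numpy as np

def gram_eigenfunctions(X, bandwidth=1.0, n_modes=5):
    """Leading eigenpairs of a Markov-normalized Gaussian Gram matrix built from snapshots X.

    The eigenvectors act as data-driven basis functions evaluated on the samples;
    projecting time-shifted data onto them yields an approximate mode decomposition.
    """
    d2 = np.sum(X**2, 1)[:, None] + np.sum(X**2, 1)[None, :] - 2.0 * X @ X.T
    G = np.exp(-d2 / (2.0 * bandwidth**2))
    G /= G.sum(axis=1, keepdims=True)            # row normalization, as in diffusion maps
    evals, evecs = np.linalg.eig(G)              # O(N^3): the bottleneck the quote refers to
    order = np.argsort(-np.abs(evals))[:n_modes]
    return evals[order], evecs[:, order]

# usage: eigenfunctions on delay-embedded snapshots of a scalar series
t = np.linspace(0, 30, 400)
s = np.sin(t) + 0.5 * np.sin(3 * t)
X = np.stack([s[:-2], s[1:-1], s[2:]], axis=1)   # 3-step delay embedding
evals, phi = gram_eigenfunctions(X, bandwidth=0.8, n_modes=4)
```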
“…See also measure-preserving EDMD [32]. Other data-driven methods include deep learning [84,86,88,98,141], reduced-order modeling [12,58], sparse identification of nonlinear dynamics [20,108], and kernel analog forecasting [25,55,143]. However, remaining challenges include the following: C1.…”
Section: Introduction
confidence: 99%