2021
DOI: 10.1098/rspa.2021.0326
Simple, low-cost and accurate data-driven geophysical forecasting with learned kernels

Abstract: Modelling geophysical processes as low-dimensional dynamical systems and regressing their vector field from data is a promising approach for learning emulators of such systems. We show that when the kernel of these emulators is also learned from data (using kernel flows, a variant of cross-validation), then the resulting data-driven models are not only faster than equation-based models but are easier to train than neural networks such as the long short-term memory neural network. In addition, they are also mor…

Cited by 13 publications (6 citation statements)
References 48 publications
“…We have focused the manuscript on the situation where the kernels of the underlying GPs are given/pre-determined. Using data-driven kernels can improve the accuracy of kernel methods by orders of magnitude [34,5,16,15,10,21,35]. There are essentially three categories of methods for learning kernels from data: variants of cross-validation (such as Kernel Flows [34]), maximum likelihood estimation (see [5] for a comparison between Kernel Flows and MLE), and maximum a posteriori estimation [30].…”
Section: Discussion
confidence: 99%
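The Kernel Flows idea referenced above can be illustrated with a minimal sketch: pick an RBF lengthscale by minimizing the cross-validation-style loss ρ = 1 − y_c^T K_c^{-1} y_c / (y^T K^{-1} y), i.e. the relative drop in RKHS norm when half the data is removed. The grid search, nugget, and function names below are illustrative assumptions, not the implementation from the cited works (which use gradient descent on parametric kernel families).

```python
import numpy as np

def rbf_kernel(X, Y, lengthscale):
    # Squared-exponential kernel matrix for 1-D inputs.
    d2 = (X[:, None] - Y[None, :]) ** 2
    return np.exp(-d2 / (2.0 * lengthscale ** 2))

def kf_loss(X, y, lengthscale, rng, nugget=1e-6):
    # Kernel-flows-style loss: rho = 1 - (RKHS norm using half the data)
    # / (RKHS norm using all the data), with a small nugget for stability.
    n = len(X)
    idx = rng.choice(n, size=n // 2, replace=False)
    K = rbf_kernel(X, X, lengthscale) + nugget * np.eye(n)
    Kc = rbf_kernel(X[idx], X[idx], lengthscale) + nugget * np.eye(len(idx))
    full = y @ np.linalg.solve(K, y)
    half = y[idx] @ np.linalg.solve(Kc, y[idx])
    return 1.0 - half / full

rng = np.random.default_rng(0)
X = np.linspace(0.0, 2.0 * np.pi, 40)
y = np.sin(X)

# Average the randomized loss over several half-samples per candidate lengthscale.
grid = [0.05, 0.2, 0.5, 1.0, 2.0]
losses = [np.mean([kf_loss(X, y, l, rng) for _ in range(20)]) for l in grid]
best = grid[int(np.argmin(losses))]
```

A grid search is used here only to keep the sketch short; the cited Kernel Flows method instead follows stochastic gradients of this loss through the kernel's parameters.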
“…Remark 5.1. Although this (classical) approach may appear simple, it can be highly effective when the underlying kernel is also learned from data [16,15]. Considering the extrapolation of time series obtained from satellite data as an example, this simple data-adapted kernel perspective outperforms (both in complexity and accuracy) PDE-based and ANN-based methods [15].…”
Section: Interpolation
confidence: 99%
“…(1) One could use the partial differential equations and boundary conditions governing mantle convection to place soft and/or hard constraints on the optimization problem at hand [33,69-71]. (2) Although POD bases are not appropriate for this problem because they do not generalize well across simulations with different parameters, the idea of finding a set of basis functions for which one needs only to learn the coefficients remains an attractive one [72-74].…”
Section: Discussion
confidence: 99%
“…which establishes (40). The identity P g(h(·), h(x)) = g(·, h(x)) employed in (41) follows from observing that the identity g(·, h(x))…”
Section: Periodic Kernels
confidence: 93%
“…When the underlying physics is unknown, the kernel can be learned from data via cross-validation/maximum likelihood estimation in a given (possibly non-parametric) family of kernels [25,37,38]. The kernel flow (a variant of cross-validation) approach [37] has been shown to be efficient for learning (possibly stochastic) dynamical systems [39-43] and designing surrogate models [44-46]. In particular, this approach has been shown to compare favorably to ANN-based methods (in terms of both complexity and accuracy) for weather/climate prediction using actual satellite data [40].…”
Section: Numerical Experiments
confidence: 99%
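The forecasting pipeline described in these excerpts — model the observable as a low-dimensional dynamical system and regress its one-step map with a kernel — can be sketched as follows. The delay length, lengthscale, and ridge parameter below are illustrative choices on a toy sine series, not values from the paper; in the cited works the kernel itself would additionally be learned via Kernel Flows.

```python
import numpy as np

def rbf(X, Y, lengthscale):
    # Squared-exponential kernel between rows of X and rows of Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * lengthscale ** 2))

series = np.sin(0.1 * np.arange(400))   # toy scalar observable
tau, lengthscale, lam = 5, 1.0, 1e-6    # delay length, kernel width, ridge

# Delay embedding: predict series[i + tau] from the window series[i : i + tau].
Z = np.stack([series[i:i + tau] for i in range(len(series) - tau)])
targets = series[tau:]
X_train, y_train = Z[:300], targets[:300]

# Kernel ridge regression of the one-step map on the embedded states.
alpha = np.linalg.solve(
    rbf(X_train, X_train, lengthscale) + lam * np.eye(300), y_train
)

# Iterated rollout: feed each prediction back into the delay window.
state = series[300:300 + tau].copy()
preds = []
for _ in range(20):
    nxt = float(rbf(state[None, :], X_train, lengthscale) @ alpha)
    preds.append(nxt)
    state = np.r_[state[1:], nxt]

truth = series[300 + tau:300 + tau + 20]
err = float(np.max(np.abs(np.array(preds) - truth)))
```

Once `alpha` is fitted, each forecast step costs one kernel evaluation against the training set, which is why such emulators can be much faster than integrating the governing equations.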