2020
DOI: 10.1090/noti2151
Bridging Data Science and Dynamical Systems Theory

Abstract: Modern science is undergoing what might arguably be called a "data revolution," manifested by a rapid growth of observed and simulated data from complex systems, as well as vigorous research on mathematical and computational frameworks for data analysis. In many scientific branches, these efforts have led to the creation of statistical models of complex systems that match or exceed the skill of first-principles models. Yet, despite these successes, statistical models are oftentimes treated as black boxes, prov…

Cited by 19 publications (28 citation statements)
References 18 publications (23 reference statements)
“…Such a spectral projection method (onto the data-driven basis constructed by eigenspaces of the kernel integral operator) has been advocated and widely used in many applications. In the context of learning dynamical systems, see [4,5,1,18] and the references therein. Assume $r_N = \operatorname{rank}(K_N) \geq M$, and denote by $\{\hat{\lambda}_j\}_{j=1}^{r_N}$ the set of all nonzero eigenvalues (in descending order, counting multiplicity) of $K_N$, with corresponding normalized eigenvectors $\{\hat{u}_j\} \subset \mathbb{R}^N$, which form an orthonormal family under the inner product $\langle \cdot, \cdot \rangle_{\pi_\delta}$…”
Section: Nyström Interpolation
confidence: 99%
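The quoted statement describes projecting onto a data-driven basis of kernel eigenvectors normalized under an empirical inner product. A minimal sketch of that construction, with an assumed Gaussian kernel, assumed sample data, and hypothetical variable names (the quoted paper's kernel, measure $\pi_\delta$, and data are not specified here):

```python
import numpy as np

# Sketch: spectral projection onto the data-driven basis given by
# eigenvectors of a kernel matrix K_N. The Gaussian kernel, bandwidth,
# and test function below are illustrative assumptions.
rng = np.random.default_rng(0)
N = 200
x = np.sort(rng.uniform(-1.0, 1.0, N))          # samples x_1, ..., x_N

eps = 0.1                                       # kernel bandwidth parameter
K = np.exp(-(x[:, None] - x[None, :]) ** 2 / eps)   # K_N[i, j] = k(x_i, x_j)

# Symmetric eigendecomposition; sort eigenvalues in descending order
lam, U = np.linalg.eigh(K)
order = np.argsort(lam)[::-1]
lam, U = lam[order], U[:, order]

# Rescale eigenvectors so they are orthonormal under the empirical
# inner product <u, v> = (1/N) * sum_i u_i v_i
U_hat = np.sqrt(N) * U
G = U_hat.T @ U_hat / N
assert np.allclose(G, np.eye(N), atol=1e-8)     # orthonormality check

# Spectral projection of a sampled function f onto the leading M modes
M = 10
f = np.sin(np.pi * x)
coeffs = U_hat[:, :M].T @ f / N                 # <f, u_hat_j>
f_proj = U_hat[:, :M] @ coeffs
residual = np.linalg.norm(f - f_proj) / np.linalg.norm(f)
print(residual)                                 # small for smooth f
```

Smooth functions are well captured by the leading eigenvectors, which is what makes such truncated bases useful for learning dynamical systems from data.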
“…Among the linear estimators, a popular approach is the kernel-based method [9,42,17,51,35,10,11], whose connection to the parametric modeling paradigm has been studied in [31]. In this direction, many nonparametric models have been proposed, including orthogonal polynomials [51], wavelets [42], Gaussian processes [17], radial kernels [17], and diffusion-maps-based models [4,5,18], just to name a few. Beyond the kernel approaches, the neural-network approach has been applied to estimate the drift coefficient [33], with application in biomolecular modeling, and the missing component in the drift term [26], with application to modeling atmospheric flow over topography.…”
Section: Introduction
confidence: 99%
“…Using these models in combination with other traditional models compromises the trustworthiness of the overall system. Some of the active areas of research in this context are: cost function modification to accommodate the model Jacobian [185], grow-when-required networks [208], physics-informed neural networks (PINNs) [247,276,365], embedding hard physical constraints in a neural network [230], leveraging uncertainty information [189], developing visualization tools for network analysis [291], and nonparametric modeling approaches for bridging data science and dynamical systems [46]. While these techniques have been emerging in both scientific computing and ML fields, they offer many opportunities to fuse topics in numerical linear algebra and theoretical computer science [122] toward improving the explainability of these models and implementing built-in sanity checks on the overall system where these models are employed.…”
Section: Introduction
confidence: 99%
“…Indeed, starting from early spectral approximation techniques for Koopman [12,13] and transfer [14-16] operators in the 1990s, there has been vigorous research on operator-theoretic approaches applicable to broad classes of autonomous [17-22] and non-autonomous [23-28] systems. In addition, recently developed methods [29-36] combine Koopman and transfer operator theory with kernel methods for machine learning [37-39] to yield data-driven algorithms adept at approximating evolution operators and their spectra.…”
Section: Introduction
confidence: 99%
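The statement above refers to data-driven approximation of evolution (Koopman) operators. A minimal sketch of one such algorithm, extended dynamic mode decomposition (EDMD), under strong simplifying assumptions: linear dynamics and a dictionary of linear observables, for which the EDMD matrix recovers the system matrix exactly (the system and dictionary choice here are illustrative, not from the quoted works):

```python
import numpy as np

# Sketch: EDMD for a linear map x_{n+1} = A x_n, using the coordinate
# functions psi(x) = x as the observable dictionary. For this choice the
# Koopman matrix restricted to span(psi) is exactly A.
rng = np.random.default_rng(1)
theta = 0.3
A = 0.9 * np.array([[np.cos(theta), -np.sin(theta)],
                    [np.sin(theta),  np.cos(theta)]])

# Snapshot pairs (x_n, y_n = A x_n), columns of X and Y
X = rng.standard_normal((2, 500))
Y = A @ X

# EDMD estimate: K = (Y X^T)(X X^T)^{-1}, a least-squares fit of the
# one-step evolution of the dictionary values
K_edmd = (Y @ X.T) @ np.linalg.inv(X @ X.T)

# Recovered Koopman eigenvalues coincide with the eigenvalues of A
ev = np.sort_complex(np.linalg.eigvals(K_edmd))
ev_true = np.sort_complex(np.linalg.eigvals(A))
err = np.max(np.abs(ev - ev_true))
print(err)   # near machine precision for this noiseless linear example
```

For nonlinear systems, the kernel-based variants cited above replace the explicit dictionary with eigenfunctions of a kernel integral operator, which avoids choosing basis functions by hand.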