2020
DOI: 10.1137/19m1267246

Physics-Informed Probabilistic Learning of Linear Embeddings of Nonlinear Dynamics with Guaranteed Stability

Abstract: The Koopman operator has emerged as a powerful tool for the analysis of nonlinear dynamical systems as it provides coordinate transformations which can globally linearize the dynamics. Recent deep learning approaches such as Linearly-Recurrent Autoencoder Networks (LRAN) show great promise for discovering the Koopman operator for a general nonlinear dynamical system from a data-driven perspective, but several challenges remain. In this work, we formalize the problem of learning the continuous-time Koopman oper…



Cited by 121 publications (72 citation statements)
References 74 publications
“…In effect, we create a model that is more "stable" and "general" than previous results, two key prerequisites for routine use in atmospheric modeling. We do this by leveraging three key innovations: 1) the use of a recurrent training regime (Brenowitz & Bretherton, 2018; McGibbon & Bretherton, 2019) to train the model to represent chemical reactions across multiple time scales, 2) the use of an encoder-operator-decoder framework to reduce the dimensionality of the chemical system (Lee & Carlberg, 2018; Pan & Duraisamy, 2019; Regazzoni et al., 2019), and 3) the use of a weighted optimization metric to create surrogate models that specialize in the prediction of key air quality metrics such as ozone and total particulate matter (PM).…”
Section: Introduction
confidence: 99%
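The encoder-operator-decoder framework mentioned in the statement above can be sketched in a few lines: lift the full state into a lower-dimensional latent space, advance it with a linear (Koopman-style) operator, and decode back. Everything here (the random encoder `E`, the diagonal operator `K`, the toy dimensions) is an illustrative assumption, not the cited papers' trained networks.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy full-order state (n = 6) reduced to a latent space (r = 3).
n, r = 6, 3
E = rng.standard_normal((r, n))   # encoder: full state -> latent
D = np.linalg.pinv(E)             # decoder: latent -> full state
K = np.diag([0.9, 0.8, 0.7])      # stable linear operator in latent space

def step(x):
    """One surrogate step: encode, advance linearly, decode."""
    z = E @ x          # reduce dimensionality
    z_next = K @ z     # linear update, all nonlinearity lives in E and D
    return D @ z_next  # reconstruct the full state

x0 = rng.standard_normal(n)
x1 = step(x0)
```

In the cited deep learning approaches the encoder and decoder are nonlinear networks trained jointly with `K`; the linear maps here only show the data flow.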
“…As shown in table 2, nearly half of the accurate KDMD eigenmodes identified are removed with the proposed sparse feature selection. Note that for all three cases, the number of selected modes (around 32 to 34) is still larger than that required in neural network models (around 10) (Otto & Rowley 2019; Pan & Duraisamy 2020). This is because the subspace spanned by KDMD/EDMD relies on a predetermined dictionary rather than being data-adaptive like neural network models.…”
Section: Results of discrete-time KDMD with mode selection
confidence: 95%
“…Although the use of a kernel defines an infinite-dimensional feature space, the resulting finite number of effective features can still be affected by both the type of kernel and the hyperparameters in the kernel, as clearly shown by Kutz et al. (2016). Compared with EDMD/KDMD, which are based on a fixed dictionary of features, neural network approaches (Lusch et al. 2018; Otto & Rowley 2019; Pan & Duraisamy 2020) have the potential to be more expressive in searching for a larger Koopman-invariant subspace. From a kernel viewpoint (Cho & Saul 2009), feedforward neural networks enable adaptation of the kernel function to the data.…”
Section: Choice of dictionary (for EDMD) or kernel (for KDMD)
confidence: 99%
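The "predetermined dictionary" contrast drawn in the statements above can be made concrete with a minimal EDMD sketch: lift scalar snapshots through a fixed monomial dictionary and solve a least-squares problem for the finite-dimensional Koopman approximation. The toy map, the monomial dictionary, and the sample size are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(1)

# Snapshot pairs (x, y) of a weakly nonlinear scalar map y = f(x).
x = rng.uniform(-1, 1, 200)
y = 0.9 * x - 0.1 * x**3

def dictionary(s):
    """Fixed, predetermined dictionary: monomials 1, s, s^2, s^3."""
    return np.vander(s, 4, increasing=True)  # shape (len(s), 4)

Psi_x, Psi_y = dictionary(x), dictionary(y)

# Least-squares Koopman approximation: Psi(y) ≈ Psi(x) @ K.
K, *_ = np.linalg.lstsq(Psi_x, Psi_y, rcond=None)
eigvals = np.linalg.eigvals(K)
```

The constant function is always in the span of the dictionary, so an eigenvalue at 1 is recovered (up to round-off); the remaining eigenvalues are only as good as the dictionary, which is exactly the limitation the quoted statements attribute to EDMD/KDMD relative to data-adaptive neural network features.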
“…However, machine learning methods, and deep learning models in particular, lack interpretability and are prone to producing physically inconsistent results. Hence, there is active research on incorporating physical models into machine learning methods to make them physically consistent, such as loss regularization based on physical laws [14], designing novel neural networks that embed certain physical properties [15], and building hybrid models that correct the imperfect knowledge in physical models [16,17]. How should we inject physics and domain knowledge into machine learning models?…”
Section: Introduction
confidence: 99%
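The first strategy listed in the statement above, loss regularization based on a physical law, can be sketched as a data-misfit term plus a penalty on the residual of a known ODE. The exponential-decay model, the synthetic data, and the weight `lam` are illustrative assumptions, not any cited paper's setup.

```python
import numpy as np

# Synthetic observations of dy/dt = -k y with k_true = 2.
t = np.linspace(0.0, 1.0, 50)
k_true = 2.0
y_data = np.exp(-k_true * t)

def total_loss(k, lam=1.0):
    """Data misfit plus a physics penalty on the ODE residual dy/dt + k y."""
    y = np.exp(-k * t)                       # model prediction
    data_loss = np.mean((y - y_data) ** 2)
    physics_residual = np.gradient(y, t) + k * y
    physics_loss = np.mean(physics_residual ** 2)
    return data_loss + lam * physics_loss

# A coarse parameter sweep in place of gradient-based training.
ks = np.linspace(0.5, 4.0, 200)
k_best = ks[np.argmin([total_loss(k) for k in ks])]
```

The physics term vanishes for any model that satisfies the governing equation, so it steers the fit toward physically consistent solutions even where data is sparse or noisy; that regularizing effect is the point of the references cited as [14] above.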