2021
DOI: 10.1016/j.jcp.2020.109922
Machine learning for prediction with missing dynamics

Cited by 61 publications (66 citation statements)
References 45 publications
“…Secondly, in the model development under various situations, e.g., when additional couplings need to be included, the issue of model closure becomes important. This has to be done in a dynamic manner, and one of the possible routes for completing this task lies through a reformulation of the problem as a supervised machine learning (SML) process [216]. In this context, we would like to mention the Nakajima-Zwanzig equation which belongs to the Mori-Zwanzig theory within the statistical mechanics of irreversible processes.…”
Section: Modelling With Nonlocality In Data-driven Environments
confidence: 99%
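As a concrete illustration of the reformulation as supervised learning mentioned in this excerpt, the sketch below fits a closure term from data. It is not the construction of [216]; the synthetic data, the kernel ridge regressor, and the helper names (f_known, corrected_rhs) are illustrative assumptions.

```python
# Minimal sketch (not the cited method): closure modelling posed as
# supervised learning. We assume trajectory data of the resolved state
# x is available together with the residual r = dx/dt - f_known(x),
# i.e. the part of the dynamics the truncated model misses.
import numpy as np
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(0)

# Hypothetical data: resolved states and a stand-in for the missing tendency.
X = rng.uniform(-1.0, 1.0, size=(2000, 3))
r = np.column_stack([np.sin(2.0 * X[:, 0]) * X[:, 1],
                     -0.5 * X[:, 2],
                     X[:, 0] * X[:, 2]])

# Learn the closure term as a regression map r ≈ g(x).
closure = KernelRidge(kernel="rbf", alpha=1e-3, gamma=2.0).fit(X, r)

def corrected_rhs(x, f_known):
    """Right-hand side of the closed reduced model: known part + learned closure."""
    return f_known(x) + closure.predict(x.reshape(1, -1))[0]
```

The same pattern applies with any regressor; the point of the excerpt is that once the missing term is framed as a map from the resolved variables (and possibly their history) to a residual, standard supervised learning machinery can be used to close the model.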
“…Data-driven approaches, which are based on statistical learning methods, provide useful and practical tools for model reduction. The past decades have witnessed revolutionary developments of data-driven strategies, ranging from parametric models (see, e.g., [8, 9, 10, 11, 12, 13, 14] and the references therein) to nonparametric and machine learning methods (see, e.g., [15, 16, 17, 18]). These developments demand a systematic understanding of model reduction from the perspectives of dynamical systems (see, e.g., [7, 19, 20]), numerical approximation [21, 22], and statistical learning [17, 23].…”
Section: Introduction
confidence: 99%
“…When the nonlinearity is complicated, a linear-in-parameter ansatz may be out of reach. One can overcome this limitation by nonparametric techniques [23, 29] and machine learning methods (see, e.g., [16, 17, 30]).…”
Section: Introduction
confidence: 99%
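To make the contrast in this excerpt concrete, here is a small sketch comparing a fixed linear-in-parameter (polynomial library) ansatz with a nonparametric regressor on the same data; the target nonlinearity and both regressors are illustrative choices, not taken from the cited references.

```python
# Minimal, illustrative sketch: a linear-in-parameter ansatz
# g(x) = sum_k c_k * phi_k(x) with a fixed polynomial library versus a
# nonparametric regressor that imposes no functional form.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(1)
x = rng.uniform(-3.0, 3.0, size=(500, 1))
y = np.tanh(4.0 * np.sin(x[:, 0])) + 0.05 * rng.standard_normal(500)  # complicated nonlinearity

# Linear-in-parameter ansatz: least squares over a small polynomial library.
Phi = np.column_stack([x[:, 0] ** k for k in range(4)])
coef, *_ = np.linalg.lstsq(Phi, y, rcond=None)
poly_err = np.mean((Phi @ coef - y) ** 2)

# Nonparametric alternative: local averaging, no fixed ansatz.
knn = KNeighborsRegressor(n_neighbors=10).fit(x, y)
knn_err = np.mean((knn.predict(x) - y) ** 2)

print(f"polynomial ansatz MSE: {poly_err:.4f}, nonparametric MSE: {knn_err:.4f}")
```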
“…And we will use the multiscale Lorenz 96 model [28] to provide an example of an ODE to which the averaging principle may be applied to effect dimension reduction, as pioneered and exploited in [14]. The work of Jiang and Harlim [21] studies data-informed model-driven prediction in partially observed ODEs, using ideas from kernel based approximation; and in the paper [20] the idea is generalized to discrete time dynamical systems, and neural networks and LSTM modeling is used in place of kernel methods. In both the papers [21,20] multiscale systems are used to test their methods in certain regimes.…”
confidence: 99%
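For readers unfamiliar with the test problem named in this excerpt, the following is a minimal sketch of a two-scale Lorenz 96 system of the kind referred to above; the parameter values and the flattened periodic indexing of the fast variables are common conventions chosen here for brevity, not necessarily those of [28] or of [21, 20].

```python
# Minimal sketch of a two-scale Lorenz 96 system, a standard test bed for
# missing-dynamics / closure experiments. Parameter values are illustrative.
import numpy as np

K, J = 9, 8                    # slow variables, fast variables per slow variable
F, h, c, b = 10.0, 1.0, 10.0, 10.0

def lorenz96_two_scale(t, state):
    x = state[:K]
    y = state[K:].reshape(K, J)
    # Slow variables: advection - damping + forcing - coupling to the fast scale.
    dx = (np.roll(x, 1) * (np.roll(x, -1) - np.roll(x, 2))
          - x + F - (h * c / b) * y.sum(axis=1))
    # Fast variables: faster advection and damping, driven by the slow scale.
    yf = y.reshape(-1)
    dy = (-c * b * np.roll(yf, -1) * (np.roll(yf, -2) - np.roll(yf, 1))
          - c * yf + (h * c / b) * np.repeat(x, J))
    return np.concatenate([dx, dy])
```

In a missing-dynamics experiment of the kind described in the excerpt, one would integrate this full system (e.g. with scipy.integrate.solve_ivp), keep only the slow variables x as observations, and train a model (kernel-based or recurrent, such as an LSTM) to predict the discarded coupling term from the observed history of x.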