2022
DOI: 10.1098/rspa.2022.0182
A low-rank ensemble Kalman filter for elliptic observations

Abstract: We propose a regularization method for ensemble Kalman filtering (EnKF) with elliptic observation operators. Commonly used EnKF regularization methods suppress state correlations at long distances. For observations described by elliptic partial differential equations, such as the pressure Poisson equation (PPE) in incompressible fluid flows, distance localization should be used cautiously, as we cannot disentangle slowly decaying physical interactions from spurious long-range correlations. This is particularly…
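For context on the analysis step the abstract refers to, the following is a minimal sketch of a stochastic EnKF update in NumPy. All dimensions, variable names, and the toy linear observation operator are illustrative assumptions, not taken from the paper; the paper's low-rank regularization for elliptic observation operators is not implemented here. Distance localization, which the abstract cautions against for elliptic observations, would correspond to an elementwise taper applied to the cross-covariance `Cxy` below.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions (illustrative, not from the paper):
# n state variables, m observations, M ensemble members
n, m, M = 50, 5, 20

def enkf_analysis(X, y, h, R):
    """Stochastic EnKF analysis step with perturbed observations.

    X: (n, M) forecast ensemble; y: (m,) observation vector;
    h: observation operator mapping a state to (m,) predicted observations;
    R: (m, m) observation-noise covariance.
    """
    HX = np.column_stack([h(X[:, j]) for j in range(M)])  # (m, M) predicted obs
    Xa = X - X.mean(axis=1, keepdims=True)                # state anomalies
    Ya = HX - HX.mean(axis=1, keepdims=True)              # predicted-obs anomalies
    Cxy = Xa @ Ya.T / (M - 1)                             # state-obs cross-covariance
    Cyy = Ya @ Ya.T / (M - 1) + R                         # innovation covariance
    K = Cxy @ np.linalg.inv(Cyy)                          # ensemble Kalman gain
    # Perturbed observations, one noise draw per member
    Y = y[:, None] + rng.multivariate_normal(np.zeros(m), R, size=M).T
    return X + K @ (Y - HX)

# Toy linear observation operator (a stand-in; the paper's operator is elliptic)
H = rng.standard_normal((m, n))
h = lambda x: H @ x
R = 0.1 * np.eye(m)

X = rng.standard_normal((n, M))                           # forecast ensemble
y = h(np.ones(n)) + rng.multivariate_normal(np.zeros(m), R)
X_post = enkf_analysis(X, y, h, R)                        # analysis ensemble, (n, M)
```

This is the generic perturbed-observation EnKF of Evensen (1994), cited in the statements below; the contribution of the paper is how to regularize this update when `h` comes from an elliptic PDE.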

Cited by 6 publications (5 citation statements) · References 36 publications
“…In the latter context – for example, in an ensemble Kalman filter (da Silva & Colonius 2018; Darakananda & Eldredge 2019; Le Provost & Eldredge 2021; Le Provost et al. 2022) – the inference comprises the analysis part of every step, when data are assimilated into the prediction. Some of the challenges and uncertainty of the static inference identified in this paper are overcome with advancing time as the sensors observe the evolving configuration of the flow and the forecast model predicts the state's evolution.…”
Section: Discussion
confidence: 99%
“…(2018), Le Provost & Eldredge (2021) and Le Provost et al. (2022), in which pressure measurements were assimilated into the estimate of the state via an ensemble Kalman filter (Evensen 1994). Each step of such sequential data assimilation consists of the same Bayesian inference (or analysis) procedure: we start with an initial guess for the probability distribution (the prior) and seek an improved guess (the posterior).…”
Section: Introduction
confidence: 99%
“…In this context, hydrodynamic and aerodynamic systems, owing to their highly nonlinear dynamics, are particularly challenging applications for data-driven approaches [2,3]. For two-dimensional flows, simplified physical models were successfully applied to estimate flow states from sparse sensor data [4,5]. Furthermore, a series of recent studies addressed the problem of flow-state reconstruction from a data-driven perspective, demonstrating the capability of data-driven approaches for estimating complex, three-dimensional and turbulent flow states [6–17].…”
Section: Introduction
confidence: 99%
“…Physical principles—such as conservation laws, symmetries and invariances—can be incorporated into ML algorithms in the form of inductive biases, thereby ensuring that the learned models are constrained to the correct physics. Recent successful examples of ML algorithms that have been modified to respect physical principles include neural networks [2–10], kernel methods [11,12], deep generative models [13], data assimilation [14,15] and sparse regression [16–20]. These examples demonstrate that incorporating partially known physical principles into ML architectures can increase the accuracy, robustness and generalizability of the resulting models, while simultaneously decreasing the required training data.…”
Section: Introduction
confidence: 99%