Data Assimilation (DA) is a methodology for combining mathematical models simulating complex systems (the background knowledge) with measurements (the reality, or observational data) in order to improve the estimate of the system state. This is a large-scale, ill-posed inverse problem; in this note we therefore consider the Tikhonov-regularized variational formulation of the 3D-DA problem, the so-called 3D-Var DA problem. We review two Domain Decomposition (DD) approaches, namely the functional DD and the discrete Multiplicative Parallel Schwarz method, applied to the Euler-Lagrange equations arising from the VarDA minimization problem; since the 3D-Var DA problem is a least squares problem, we prove the equivalence between these approaches. The note is organized as follows: in section 2 we briefly review the DA inverse problem and its variational formulation [12]; in section 3 we apply the DD approaches and prove the main result.
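To make the variational formulation concrete, the following is a minimal numerical sketch (not the note's actual algorithm) of the 3D-Var cost function and its minimizer via the normal equations; all matrices and dimensions here are illustrative assumptions.

```python
import numpy as np

# Illustrative 3D-Var: minimize the Tikhonov-regularized cost
#   J(x) = (x - x_b)^T B^{-1} (x - x_b) + (H x - y)^T R^{-1} (H x - y),
# whose minimizer solves the normal equations
#   (B^{-1} + H^T R^{-1} H) x = B^{-1} x_b + H^T R^{-1} y.

def three_d_var(x_b, B, H, y, R):
    """Return the 3D-Var analysis state for background x_b and observations y."""
    Binv = np.linalg.inv(B)
    Rinv = np.linalg.inv(R)
    A = Binv + H.T @ Rinv @ H          # Hessian of J (symmetric positive definite)
    b = Binv @ x_b + H.T @ Rinv @ y
    return np.linalg.solve(A, b)

# Toy problem: 3 state variables, 2 observations (all values hypothetical)
x_b = np.array([1.0, 2.0, 3.0])                   # background state
B = 0.5 * np.eye(3)                               # background error covariance
H = np.array([[1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])  # observation operator
y = np.array([1.2, 2.8])                          # observations
R = 0.1 * np.eye(2)                               # observation error covariance

x_a = three_d_var(x_b, B, H, y, R)  # analysis state
```

The regularization term (the background misfit weighted by B⁻¹) is what restores well-posedness to the otherwise ill-posed inverse problem.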
Summary
The Kalman filter (KF) is one of the most important and widely used estimation algorithms. We introduce an innovative design of the KF algorithm based on domain decomposition (we call it DD‐KF). DD‐KF involves decomposition of the whole computational problem, partitioning of the solution, and a slight modification of the KF algorithm allowing a run‐time correction of local solutions. The resulting parallel algorithm consists of concurrent copies of the KF algorithm, each one requiring the same amount of computation on each subdomain, with an exchange of boundary conditions between adjacent subdomains. The main advantage of this approach is that it can potentially be applied in a moderately nonintrusive manner to existing codes for tracking and control systems in localization, navigation, computer graphics, and many other state estimation problems. To highlight the capability of DD‐KF to exploit the computing power provided by future microprocessor designs based on multi/many‐core CPU/GPU technologies, we consider DD both at the physical core level and at the microprocessor level, and we discuss the scalability of the DD‐KF algorithm at coarse- and fine-grained levels. Throughout the present work, we derive and discuss the DD‐KF algorithm for solving the constrained least squares model, which underlies any data sampling and estimation problem.
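For reference, here is a minimal sketch of one standard KF cycle (predict + update), the building block that DD‐KF replicates on each subdomain; the boundary-condition exchange between subdomains that defines DD‐KF is not shown, and all matrices below are toy assumptions.

```python
import numpy as np

# One standard Kalman filter step (predict + update). DD-KF runs
# concurrent copies of this step per subdomain, coupled by an exchange
# of boundary conditions; that coupling is omitted in this sketch.

def kf_step(x, P, M, Q, H, R, y):
    """x, P: prior state and covariance; M, Q: dynamics and model noise;
    H, R: observation operator and noise; y: observation."""
    # Predict (forecast)
    x_f = M @ x
    P_f = M @ P @ M.T + Q
    # Update (analysis)
    S = H @ P_f @ H.T + R                  # innovation covariance
    K = P_f @ H.T @ np.linalg.inv(S)       # Kalman gain
    x_a = x_f + K @ (y - H @ x_f)          # corrected state
    P_a = (np.eye(len(x)) - K @ H) @ P_f   # corrected covariance
    return x_a, P_a

# Toy 2-variable system, fully observed (hypothetical values)
x, P = np.zeros(2), np.eye(2)
M = np.array([[1.0, 0.1], [0.0, 1.0]])
Q, H, R = 0.01 * np.eye(2), np.eye(2), 0.1 * np.eye(2)
y_obs = np.array([0.5, 0.1])
x, P = kf_step(x, P, M, Q, H, R, y_obs)
```

In the DD‐KF setting, each subdomain would hold only its local slice of `x` and `P`, making the per-step cost equal across subdomains.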
We focus on Partial Differential Equation (PDE)-based Data Assimilation (DA) problems solved by means of variational approaches and the Kalman filter algorithm. Recently, we presented a Domain Decomposition framework (DD-DA, for short) performing a decomposition of the whole physical domain along the space and time directions, joining the idea of Schwarz methods and parallel-in-time approaches. For effective parallelization of DD-DA algorithms, the computational load assigned to subdomains must be equally distributed. Usually the computational cost is proportional to the amount of data entities assigned to the partitions. Good-quality partitioning also requires that the volume of communication during the calculation be kept to a minimum. In order to deal with DD-DA problems where the observations are nonuniformly distributed and generally sparse, in the present work we employ a parallel load-balancing algorithm based on adaptive and dynamic redefinition of the DD boundaries, aimed at balancing the workload according to data location. We call it DyDD. As the numerical model underlying DA problems arising from the so-called discretize-then-optimize approach is the constrained least squares (CLS) model, we use CLS as the reference state estimation problem and validate DyDD on different scenarios.
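The workload criterion behind this kind of data-driven balancing can be illustrated in one dimension: place subdomain boundaries at quantiles of the observation locations so each subdomain receives roughly the same number of observations. This static quantile split is only a hypothetical sketch of the idea; the actual DyDD algorithm redefines boundaries adaptively and dynamically.

```python
import numpy as np

# Illustrative 1D load balancing: choose interior subdomain boundaries
# at quantiles of the observation locations, so each subdomain holds
# approximately the same number of observations (hence similar workload).

def balanced_boundaries(obs_locations, n_subdomains):
    """Return interior boundary positions giving ~equal observation counts."""
    qs = np.linspace(0, 1, n_subdomains + 1)[1:-1]  # interior quantile levels
    return np.quantile(obs_locations, qs)

# Nonuniformly distributed observations on [0, 1]: a dense cluster
# near the origin and a sparse tail (synthetic data)
rng = np.random.default_rng(0)
obs = np.concatenate([rng.uniform(0.0, 0.2, 800),   # dense cluster
                      rng.uniform(0.2, 1.0, 200)])  # sparse tail
bounds = balanced_boundaries(obs, 4)
counts = np.histogram(obs, bins=np.concatenate(([0.0], bounds, [1.0])))[0]
# A uniform split of [0, 1] into 4 pieces would put ~800 observations
# in the first subdomain; the quantile split yields ~250 per subdomain.
```

The design point here is that boundaries follow the data, not the geometry, which is what keeps per-subdomain cost balanced when observations are clustered.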