2011
DOI: 10.1002/qj.743

A reduced and limited‐memory preconditioned approach for the 4D‐Var data‐assimilation problem

Abstract: We recall a theoretical analysis of the equivalence between the Kalman filter and the four-dimensional variational (4D-Var) approach to solve data-assimilation problems. This result is then extended to cover the comparison of the singular evolutive extended Kalman (SEEK) filter with a reduced variant of the 4D-Var algorithm. We next concentrate on the solution of the 4D-Var, which is usually computed with a (truncated) Gauss-Newton algorithm using a preconditioned conjugate-gradient-like (CG) method. Motivated…
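The abstract's mention of solving the 4D-Var inner problem with a preconditioned CG method can be illustrated with a minimal sketch. The `pcg` function below is a textbook dense-matrix version with an explicitly applied inverse preconditioner `M_inv` (both names are assumptions for illustration), not the paper's implementation:

```python
import numpy as np

def pcg(A, b, M_inv, tol=1e-10, max_iter=200):
    """Preconditioned conjugate gradients for an SPD system A x = b.

    Illustrative sketch: A and M_inv are plain dense NumPy arrays;
    an operational system applies both as matrix-free operators.
    """
    x = np.zeros_like(b)
    r = b - A @ x                  # initial residual
    z = M_inv @ r                  # preconditioned residual
    p = z.copy()                   # first search direction
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)      # step length along p
        x = x + alpha * p
        r = r - alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_inv @ r
        rz_new = r @ z
        p = z + (rz_new / rz) * p  # conjugate direction update
        rz = rz_new
    return x
```

With `M_inv` set to the identity this reduces to plain CG; a good preconditioner clusters the spectrum of the preconditioned Hessian and cuts the inner-loop iteration count.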


Cited by 7 publications (10 citation statements)
References 37 publications
“…Moreover, in an incremental 4D‐Var assimilation system, an approximate Hessian derived from a similar methodology could be used to precondition the CG minimization of the inner loops. In particular, the LAST_DIAG_CYC preconditioner could be used in the quasi‐Newton LMP approach proposed by Tshimanga et al. and Gratton et al. to further improve the convergence of the inner loops. Another interesting application of the BFGS method in variational data assimilation was suggested by Fisher and Courtier.…”
Section: Discussion (mentioning)
confidence: 99%
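The limited-memory preconditioner (LMP) referenced here has a well-known spectral variant that augments the identity with a few (approximate) eigenpairs of the Hessian. A minimal sketch, assuming exact eigenpairs and forming the preconditioner `P` as a dense matrix (an operational LMP would be applied matrix-free):

```python
import numpy as np

def spectral_lmp(eigvals, eigvecs):
    """Spectral limited-memory preconditioner built from k eigenpairs
    (theta_i, u_i) of an SPD Hessian A:

        P = I + sum_i (1/theta_i - 1) u_i u_i^T.

    Applied to A, P maps each retained eigendirection to eigenvalue 1,
    clustering the preconditioned spectrum. Illustrative sketch only.
    """
    n = eigvecs.shape[0]
    P = np.eye(n)
    for theta, u in zip(eigvals, eigvecs.T):
        P += (1.0 / theta - 1.0) * np.outer(u, u)
    return P
```

For exact eigenpairs, `P @ A` leaves each retained eigenvector fixed, so CG on the preconditioned system converges as if those directions had already been solved.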
“…Note that the idea of using information from multiple inversions to improve the inverse Hessian estimate is not new. For example, the LMP described in Tshimanga et al. and Gratton et al., when used in the context of incremental 4D‐Var data assimilation, can construct a BFGS approximation of the inverse Hessian from the sequence of quadratic CG minimizations of the outer loops, gradually improving the preconditioning of the systems. In the BFGS_HYBRID approach, the ensemble of minimization problems is obtained through random perturbations of the prior and observations in the 4D‐Var cost function.…”
Section: Numerical Experiments (mentioning)
confidence: 99%
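The BFGS inverse-Hessian approximation mentioned here can be sketched with the standard dense update formula applied to one secant pair (s, y); for the quadratic inner-loop cost, y = A s. This is the generic BFGS formula, not the specific limited-memory construction of the cited works:

```python
import numpy as np

def bfgs_inverse_update(H, s, y):
    """One BFGS update of an inverse-Hessian approximation H from a
    step s and gradient difference y:

        H_new = (I - rho s y^T) H (I - rho y s^T) + rho s s^T,
        rho   = 1 / (y^T s).

    Sketch only: a limited-memory variant stores the (s, y) pairs and
    applies H_new implicitly instead of forming a dense matrix.
    """
    rho = 1.0 / (y @ s)
    n = H.shape[0]
    V = np.eye(n) - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)
```

By construction the update enforces the secant condition H_new y = s, which is how successive CG minimizations feed curvature information into the preconditioner.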
“…Gratton et al. propose, for instance, to compute an approximate initial guess for the inner linear systems in a reduced space given by a spectral decomposition of the initial Hessian operator.…”
Section: Discussion (mentioning)
confidence: 99%
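One way to picture a reduced-space first guess of this kind: project the right-hand side onto the k leading eigenvectors of the Hessian and invert there. The function below is a hypothetical illustration under that assumption; the paper's actual construction may differ:

```python
import numpy as np

def reduced_space_guess(A, b, k):
    """Approximate initial guess for A x = b in the span of the k
    leading eigenvectors of the SPD matrix A:

        x0 = U_k diag(1/lam_k) U_k^T b.

    Hypothetical sketch: computes the full eigendecomposition for
    clarity, whereas a real system would use a few Lanczos/Ritz pairs.
    """
    lam, U = np.linalg.eigh(A)        # eigenvalues in ascending order
    Uk, lamk = U[:, -k:], lam[-k:]    # leading k eigenpairs
    return Uk @ ((Uk.T @ b) / lamk)
```

Taking k equal to the full dimension recovers the exact solution, so the guess interpolates between zero (k = 0) and the solved system.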
“…With a linear operator H, solving Equation is equivalent to solving the Kalman filter equation (Gratton et al.) and the solution of Equation is given by
$\mathbf{x}_a = \mathbf{x}_b + \mathbf{B}\mathbf{H}^{\mathrm{T}}\left(\mathbf{H}\mathbf{B}\mathbf{H}^{\mathrm{T}} + \mathbf{R}\right)^{-1}\left(\mathbf{y} - \mathbf{H}\mathbf{x}_b\right).$ …”
Section: Background‐error Covariance in Data Assimilation (mentioning)
confidence: 99%
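The linear analysis update quoted above can be checked numerically. A minimal dense sketch (illustrative only; operational systems never form B, H, or the inverse explicitly):

```python
import numpy as np

def analysis_update(xb, y, H, B, R):
    """Linear (Kalman/BLUE) analysis update:

        x_a = x_b + B H^T (H B H^T + R)^{-1} (y - H x_b)

    with background state xb, observations y, observation operator H,
    background-error covariance B, and observation-error covariance R.
    """
    S = H @ B @ H.T + R               # innovation covariance
    K = B @ H.T @ np.linalg.inv(S)    # gain matrix
    return xb + K @ (y - H @ xb)
```

In the limiting case of perfect, direct observations (H the identity, R zero) the analysis collapses onto the observations, a quick sanity check on the formula.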