Bayes Meets Krylov: Statistically Inspired Preconditioners for CGLS
2018
DOI: 10.1137/15m1055061

Cited by 35 publications (32 citation statements). References 43 publications.
“…As many numerical methods rely on linear solvers, such as the CG method, understanding the error incurred by these numerical methods is critical. Other recent works highlighting the value of statistical thinking in this application area include Calvetti et al. [2018].…”
Section: Probabilistic Numerical Methods (mentioning)
confidence: 99%
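The point about solver error can be made concrete with a minimal sketch. The SPD test matrix, its spectrum, and the use of SciPy's CG are assumptions for illustration only and are not taken from the cited works; the sketch simply measures how far a truncated CG iterate lies from the exact solution.

```python
import numpy as np
from scipy.sparse.linalg import cg

# Hypothetical SPD test problem; sizes and spectrum are illustrative only.
rng = np.random.default_rng(0)
n = 300
Q = np.linalg.qr(rng.standard_normal((n, n)))[0]
A = Q @ np.diag(np.linspace(1.0, 1e3, n)) @ Q.T   # SPD, condition number ~1e3
x_true = rng.standard_normal(n)
b = A @ x_true

for maxiter in (10, 50, 200):
    # CG truncated at a fixed iteration budget (default tolerance otherwise).
    x_k, info = cg(A, b, maxiter=maxiter)
    err = np.linalg.norm(x_k - x_true) / np.linalg.norm(x_true)
    print(f"CG, at most {maxiter:3d} iterations: relative error {err:.2e}")
```

The decreasing relative error across iteration budgets is exactly the solver-induced error that the quoted passage argues should be accounted for.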
“…To solve high-dimensional problems, the algorithm should be implemented more efficiently. For instance, the slight instabilities appearing near the initial point in Figure 7 can be avoided by solving the overdetermined linear system (17) with Krylov methods [24]. Moreover, the Jacobian matrix J_G can be compressed by the usual techniques [25].…”
Section: Discussion (mentioning)
confidence: 99%
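To illustrate the remark about Krylov solvers for an overdetermined system, here is a minimal sketch. The matrix A and right-hand side b are synthetic stand-ins for system (17), and SciPy's LSQR (a Krylov least-squares solver, mathematically equivalent to CGLS) is used as a representative method rather than the specific one in [24].

```python
import numpy as np
from scipy.sparse import random as sparse_random
from scipy.sparse.linalg import lsqr

# Synthetic overdetermined system A x ~= b, standing in for system (17).
rng = np.random.default_rng(0)
m, n = 2000, 200                                   # more equations than unknowns
A = sparse_random(m, n, density=0.05, random_state=0)
x_true = rng.standard_normal(n)
b = A @ x_true + 1e-3 * rng.standard_normal(m)

# LSQR: a Krylov-subspace least-squares solver applied directly to the
# rectangular system, without forming the normal equations explicitly.
x, istop, itn, r1norm = lsqr(A, b, atol=1e-10, btol=1e-10)[:4]
print(f"stop flag {istop}, {itn} iterations, residual norm {r1norm:.2e}")
print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```

Working with the rectangular system in this way avoids squaring the condition number, which is the usual reason such solvers behave more stably than naive alternatives.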
“…For the deterministic version of the ensemble Kalman filter [24], the ETKF illustrated in Algorithm 7, we again write the forecast error covariance in low-rank form (33), with ensemble members {x_k^F}_{k=1}^r. It is then assumed that the state estimate x^A is of the form x^A = x^F + X^F w^A, where w^A is a vector of coefficients in the low-dimensional ensemble subspace R^r, and X^F ∈ R^{n×r}.…”
Section: Compute the Forecast Ensemble E^F = …_I(E^A); End For (mentioning)
confidence: 99%
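To make the quoted ensemble-subspace representation concrete, here is a minimal sketch of an ETKF-style analysis mean. The dimensions, the observation operator H, the observation covariance R, and the scaling of the anomalies by 1/sqrt(r-1) are all assumptions for illustration and are not taken from Algorithm 7 of the citing paper.

```python
import numpy as np

rng = np.random.default_rng(1)
n, r, p = 500, 20, 40                        # hypothetical state, ensemble, observation sizes

# Forecast ensemble (columns are members); in practice produced by the model run.
E_F = rng.standard_normal((n, r))
x_F = E_F.mean(axis=1)                       # forecast mean x^F
X_F = (E_F - x_F[:, None]) / np.sqrt(r - 1)  # scaled forecast anomalies, n x r

# Hypothetical linear observation operator and observation error covariance R = 0.25 I.
H = rng.standard_normal((p, n))
R_inv = np.eye(p) / 0.25
y = H @ x_F + 0.5 * rng.standard_normal(p)   # synthetic observations

# All analysis work happens in the r-dimensional ensemble subspace:
Y = H @ X_F                                  # observed anomalies, p x r
S = np.eye(r) + Y.T @ R_inv @ Y              # small r x r system matrix
w_A = np.linalg.solve(S, Y.T @ R_inv @ (y - H @ x_F))

# Analysis mean in the quoted form x^A = x^F + X^F w^A.
x_A = x_F + X_F @ w_A
print("size of analysis increment:", np.linalg.norm(x_A - x_F))
```

The key design point in the quoted passage is that only the small r x r system for w^A is solved, so the cost of the analysis step scales with the ensemble size rather than the full state dimension n.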