2000
DOI: 10.1137/s1052623497327854

Automatic Preconditioning by Limited Memory Quasi-Newton Updating

Abstract: The paper proposes a preconditioner for the conjugate gradient method (CG) that is designed for solving systems of equations Ax = b_i with different right-hand side vectors, or for solving a sequence of slowly varying systems A_k x = b_k. The preconditioner has the form of a limited memory quasi-Newton matrix and is generated using information from the CG iteration. The automatic preconditioner does not require explicit knowledge of the coefficient matrix A and is therefore suitable for problems where only produ…
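The idea in the abstract can be illustrated with a short sketch (illustrative only, not the exact pairing or sampling strategy of the paper): an ordinary CG solve stores a few curvature pairs (s, y) = (alpha*p, alpha*A*p) gathered from its own iterates, and the standard L-BFGS two-loop recursion then applies the resulting limited memory quasi-Newton matrix as an approximate inverse of A, ready to serve as a preconditioner for subsequent right-hand sides. All names below are illustrative.

```python
import numpy as np

def cg_collect_pairs(A, b, tol=1e-8, max_iter=200, m=5):
    """Conjugate gradient solve of A x = b (A symmetric positive definite)
    that also records up to m curvature pairs (s, y) = (alpha*p, alpha*A*p)
    from its own iterates, for later use in a quasi-Newton preconditioner."""
    x = np.zeros_like(b, dtype=float)
    r = b - A @ x
    p = r.copy()
    pairs = []
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        pairs.append((alpha * p, alpha * Ap))  # curvature pair (s, y)
        if len(pairs) > m:
            pairs.pop(0)                       # keep only the newest m pairs
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return x, pairs

def lbfgs_apply(pairs, v):
    """Two-loop recursion: apply the limited memory quasi-Newton matrix built
    from the stored (s, y) pairs (an approximation to A^{-1}) to the vector v."""
    q = np.array(v, dtype=float)
    rhos = [1.0 / (y @ s) for s, y in pairs]
    alphas = []
    for (s, y), rho in zip(reversed(pairs), reversed(rhos)):
        a = rho * (s @ q)
        alphas.append(a)
        q -= a * y
    s_last, y_last = pairs[-1]
    q *= (s_last @ y_last) / (y_last @ y_last)  # initial scaling H0 = gamma * I
    for (s, y), rho, a in zip(pairs, rhos, reversed(alphas)):
        b = rho * (y @ q)
        q += (a - b) * s
    return q
```

Because only the stored pairs are needed, applying the preconditioner costs a handful of dot products and vector sums and never requires A explicitly.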

Cited by 138 publications (165 citation statements: 2 supporting, 163 mentioning, 0 contrasting)
References 12 publications
“…To check whether this inferior performance is a result of a poor preconditioner, or reflects a more general difficulty of TN methods, we carried out several minimizations for the loop with ε=1 using the hybrid method with l=0, i.e., applying pure HFN, which is expected to have a more advanced preconditioner than that of TN(Nash). 45 However, the shortest minimization of 137 seconds (maxcg=50) is very close to 140 sec obtained with TN(Nash) whereas using maxcg = 5, 20 and 70 required 142, 156, and 161 sec, respectively. To further investigate this problem it would be of interest to apply a TN method such as that provided by TNPACK where the preconditioning matrix can be tailored to the specific problem studied.…”
Section: As In
Citation type: mentioning
confidence: 59%
“…Indeed, Alekseev and Navon 44 have found the hybrid method to be the best performer as tested on cost functionals related to inverse problems in fluid dynamics (See also Ref. 45). …”
Section: Introduction
Citation type: mentioning
confidence: 99%
“…The iterative solver employed is the Conjugate Gradient (CG) method or its variants [4] and we propose its use in combination with limited memory preconditioners. A preconditioner is denoted as limited memory if it can be stored compactly in a few vectors of length m, and its product by a vector calls for scalar products and, possibly, sums of vectors [10]. The limited memory preconditioners studied in this work belong to both the class of Incomplete Cholesky factorizations and to the class of Quasi-Newton preconditioners.…”
Section: Introduction
Citation type: mentioning
confidence: 99%
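As a usage sketch only (assuming the hypothetical `lbfgs_apply`, `pairs`, matrix `A`, size `n`, and new right-hand side `b2` from the example after the abstract), such a limited memory operator can be wrapped as a SciPy preconditioner and reused for the next solve:

```python
from scipy.sparse.linalg import LinearOperator, cg

# Hypothetical continuation of the earlier sketch: reuse the pairs gathered in
# a previous solve as the preconditioner M ~ A^{-1} for a new right-hand side.
M = LinearOperator((n, n), matvec=lambda v: lbfgs_apply(pairs, v))
x2, info = cg(A, b2, M=M)  # info == 0 indicates convergence
```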
“…Limited memory Quasi-Newton preconditioners are a class of matrices built drawing inspiration from Quasi-Newton schemes for convex quadratic programming [10]. Given a preconditioner (first-level preconditioner), Quasi-Newton preconditioners provide its update (second-level preconditioner) by exploiting a few vectors of dimension m, i.e., information belonging to a low-dimensional subspace of R^m.…”
Section: Introduction
Citation type: mentioning
confidence: 99%
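The "second-level" idea in the quotation can be written compactly: given a routine applying a first-level preconditioner H0 (for instance an incomplete Cholesky solve) and a curvature pair (s, y) taken from a previous solve, a BFGS-style update yields the improved operator. This is a generic sketch of that construction, not the specific preconditioners of the cited work; all names are illustrative.

```python
def qn_update_preconditioner(apply_H0, s, y):
    """Return a function applying the BFGS-updated operator
    H1 = (I - rho*s*y^T) H0 (I - rho*y*s^T) + rho*s*s^T,
    i.e. a second-level preconditioner built on top of apply_H0."""
    rho = 1.0 / (y @ s)
    def apply_H1(v):
        w = v - rho * (s @ v) * y      # (I - rho*y*s^T) v
        w = apply_H0(w)                # apply the first-level preconditioner
        w = w - rho * (y @ w) * s      # (I - rho*s*y^T) (...)
        return w + rho * (s @ v) * s   # + rho*s*s^T v
    return apply_H1
```

Applying H1 costs one application of H0 plus a few dot products and vector sums, so it stays within the limited memory budget described above.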
“…Using the same preconditioner over two or more similar linear solves is accepted practice [36], [37], [43]. For example, a Jacobian may be reused over several steps in a modified Newton method [31].…”
Section: Introduction
Citation type: mentioning
confidence: 99%
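A minimal sketch of the reuse practice mentioned in the quotation, under the assumption of user-supplied residual F and Jacobian J (both hypothetical): the Jacobian is factorized only every few iterations and the frozen LU factors are reused in between, the same economy as keeping one preconditioner over several similar linear solves.

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

def modified_newton(F, J, x0, refresh_every=5, tol=1e-10, max_iter=50):
    """Modified Newton iteration that refactorizes the Jacobian only every
    `refresh_every` steps and reuses the frozen LU factors in between."""
    x = np.asarray(x0, dtype=float).copy()
    factors = None
    for k in range(max_iter):
        r = F(x)
        if np.linalg.norm(r) < tol:
            break
        if factors is None or k % refresh_every == 0:
            factors = lu_factor(J(x))   # occasional refactorization
        x -= lu_solve(factors, r)       # reuse the frozen factors
    return x
```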