2020
DOI: 10.1002/nla.2343

Low synchronization Gram–Schmidt and generalized minimal residual algorithms

Abstract: The Gram–Schmidt process uses orthogonal projection to construct the A = QR factorization of a matrix. When Q has linearly independent columns, the operator P = I − Q(Q^T Q)^{-1} Q^T defines an orthogonal projection onto Q^⊥. In finite precision, Q loses orthogonality as the factorization progresses. A family of approximate projections is derived with the form P = I − Q T Q^T, with correction matrix T. When T = (Q^T Q)^{-1} and T is triangular, it is postulated that the best achievable orthogonality is 𝒪(ε)κ(A). We …
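
A minimal NumPy sketch of the projection family described in the abstract may help; the matrix sizes and the 1e-8 perturbation below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a Q whose columns are independent but no longer orthonormal,
# as happens when Gram-Schmidt loses orthogonality in finite precision.
Q, _ = np.linalg.qr(rng.standard_normal((100, 10)))
Q += 1e-8 * rng.standard_normal(Q.shape)

v = rng.standard_normal(100)

# Exact correction matrix T = (Q^T Q)^{-1}: P = I - Q T Q^T is then the
# true orthogonal projector onto the complement of range(Q).
T = np.linalg.inv(Q.T @ Q)
w = v - Q @ (T @ (Q.T @ v))
print(np.linalg.norm(Q.T @ w))    # ~1e-16: w is orthogonal to range(Q)

# Uncorrected T = I (plain P = I - Q Q^T) leaves a residual at the level
# of the orthogonality loss in Q.
w0 = v - Q @ (Q.T @ v)
print(np.linalg.norm(Q.T @ w0))   # ~1e-8
```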

Cited by 28 publications (57 citation statements) · References 36 publications
“…Although somewhat outside the scope of this review, we can demonstrate that it is possible to modify the Gratton et al. (2020) analysis based on the inverse compact WY form of the MGS algorithm, introduced by Świrydowicz et al. (2020). Rather than treat all of the inner products in the MGS-GMRES algorithm equally, consider the strictly upper triangular matrix U = L^T from the loss of orthogonality relations.…”
Section: Sparse Linear Algebra (mentioning; confidence: 95%)
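
As a hedged illustration of the quantity this statement refers to: for a Q computed by plain MGS in floating point, the loss-of-orthogonality matrix Q^T Q − I is symmetric, so its strictly upper triangular part U equals the transpose of its strictly lower triangular part L. The test matrix below is an assumption chosen only to make the loss visible.

```python
import numpy as np

def mgs(A):
    """Modified Gram-Schmidt QR: the algorithm whose loss of
    orthogonality the quoted analysis studies."""
    m, n = A.shape
    Q = A.astype(float).copy()
    R = np.zeros((n, n))
    for j in range(n):
        for i in range(j):
            R[i, j] = Q[:, i] @ Q[:, j]
            Q[:, j] -= R[i, j] * Q[:, i]
        R[j, j] = np.linalg.norm(Q[:, j])
        Q[:, j] /= R[j, j]
    return Q, R

# Ill-conditioned test matrix so the loss of orthogonality is visible.
rng = np.random.default_rng(1)
P, _ = np.linalg.qr(rng.standard_normal((200, 20)))
W, _ = np.linalg.qr(rng.standard_normal((20, 20)))
A = P @ np.diag(np.logspace(0, -10, 20)) @ W.T   # kappa(A) ~ 1e10

Q, R = mgs(A)
S = Q.T @ Q - np.eye(20)         # loss of orthogonality relations
L = np.tril(S, -1)               # strictly lower triangular part
U = np.triu(S, 1)                # strictly upper triangular part
print(np.linalg.norm(U - L.T))   # ~ machine eps: U = L^T, as quoted
print(np.linalg.norm(S))         # ~ eps * kappa(A) for MGS
```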
“…Barlow then extends this approach to the block compact WY form of MGS; see also the technical report by Sun (1996). The contribution by Świrydowicz et al. (2020) was to note that there exists an inverse compact WY representation for MGS, having the projector P = I − Q T Q^T with lower triangular correction matrix T.…”
Section: Sparse Linear Algebra (mentioning; confidence: 99%)
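
A minimal sketch of the representation described in this statement, under the assumption (consistent with the abstract) that the projector is P = I − Q T Q^T with T = (I + L)^{-1}, L the strictly lower triangular part of Q^T Q; the random test data is illustrative. The identity holds for any normalized columns, which is what makes it a faithful rewrite of the sequential MGS projections.

```python
import numpy as np
from scipy.linalg import solve_triangular

rng = np.random.default_rng(2)

# Normalized but non-orthogonal columns, standing in for a finite-precision Q.
Q = rng.standard_normal((100, 10))
Q /= np.linalg.norm(Q, axis=0)
v = rng.standard_normal(100)

# Sequential MGS projections: (I - q_10 q_10^T) ... (I - q_1 q_1^T) v.
w_seq = v.copy()
for k in range(Q.shape[1]):
    w_seq -= (Q[:, k] @ w_seq) * Q[:, k]

# Inverse compact WY form: the same map written as I - Q T Q^T with
# lower triangular T = (I + L)^{-1}, L the strictly lower triangular
# part of Q^T Q. T is applied by one triangular solve, never formed.
L = np.tril(Q.T @ Q, -1)
y = solve_triangular(np.eye(10) + L, Q.T @ v, lower=True)   # y = T Q^T v
w_wy = v - Q @ y

print(np.linalg.norm(w_seq - w_wy))   # ~ machine eps: the two forms agree
```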
“…In the case of CGS, CGS-2 (classical Gram–Schmidt with reorthogonalization) corrects the projection by reorthogonalizing the vectors of Q and thereby reduces the loss of orthogonality to O(ε) [10,20]. The form of the correction matrix for this algorithm was recently derived in [23]; it requires a triangular solve instead of the recursive construction of the correction matrix T from a block triangular inverse (presented in Section 3.2). The required number of dot products per iteration is effectively reduced to one for GMRES and s-step Krylov solvers [23,25].…”
Section: QR Factorization with Gram–Schmidt (mentioning; confidence: 99%)
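
The CGS-2 variant named in this statement can be sketched as follows; this is the plain two-pass reorthogonalization the quote describes, not the low-synchronization variant of [23], and the ill-conditioned test matrix is an assumption for demonstration.

```python
import numpy as np

def cgs2(A):
    """Classical Gram-Schmidt with reorthogonalization (CGS-2): each new
    column is projected against the computed Q twice; the second pass
    restores O(eps) orthogonality."""
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for j in range(n):
        v = A[:, j].copy()
        for _ in range(2):                 # pass 2 = reorthogonalization
            c = Q[:, :j].T @ v
            v -= Q[:, :j] @ c
            R[:j, j] += c
        R[j, j] = np.linalg.norm(v)
        Q[:, j] = v / R[j, j]
    return Q, R

rng = np.random.default_rng(3)
P, _ = np.linalg.qr(rng.standard_normal((200, 20)))
W, _ = np.linalg.qr(rng.standard_normal((20, 20)))
A = P @ np.diag(np.logspace(0, -10, 20)) @ W.T   # kappa(A) ~ 1e10

Q, R = cgs2(A)
print(np.linalg.norm(Q.T @ Q - np.eye(20)))      # ~1e-15 despite the conditioning
```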
“…The form of the correction matrix for this algorithm was recently derived in [23]; it requires a triangular solve instead of the recursive construction of the correction matrix T from a block triangular inverse (presented in Section 3.2). The required number of dot products per iteration is effectively reduced to one for GMRES and s-step Krylov solvers [23,25]. The inverse compact WY algorithm maintains O(ε)κ(A) loss of orthogonality.…”
Section: QR Factorization with Gram–Schmidt (mentioning; confidence: 99%)