2007
DOI: 10.1007/s00211-007-0114-x
Fast linear algebra is stable

Abstract: In [23] we showed that a large class of fast recursive matrix multiplication algorithms is stable in a normwise sense, and that in fact if multiplication of n-by-n matrices can be done by any algorithm in O(n^{ω+η}) operations for any η > 0, then it can be done stably in O(n^{ω+η}) operations for any η > 0. Here we extend this result to show that essentially all standard linear algebra operations, including LU decomposition, QR decomposition, linear equation solving, matrix inversion, solving least squares probl…
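The abstract concerns fast recursive matrix multiplication algorithms; the prototypical example is Strassen's algorithm, which multiplies two n-by-n matrices with 7 rather than 8 half-size multiplies, giving O(n^{log2 7}) ≈ O(n^{2.81}) arithmetic. The sketch below is the standard textbook construction, not code from the paper; the power-of-two dimension and the leaf-size cutoff are simplifying assumptions.

```python
import numpy as np

def strassen(A, B, leaf=64):
    """Strassen matrix multiply: 7 recursive multiplies per level
    instead of 8. Assumes square matrices with power-of-two size."""
    n = A.shape[0]
    if n <= leaf:
        return A @ B  # fall back to classical multiply on small blocks
    m = n // 2
    A11, A12, A21, A22 = A[:m, :m], A[:m, m:], A[m:, :m], A[m:, m:]
    B11, B12, B21, B22 = B[:m, :m], B[:m, m:], B[m:, :m], B[m:, m:]
    # the seven Strassen products
    M1 = strassen(A11 + A22, B11 + B22, leaf)
    M2 = strassen(A21 + A22, B11, leaf)
    M3 = strassen(A11, B12 - B22, leaf)
    M4 = strassen(A22, B21 - B11, leaf)
    M5 = strassen(A11 + A12, B22, leaf)
    M6 = strassen(A21 - A11, B11 + B12, leaf)
    M7 = strassen(A12 - A22, B21 + B22, leaf)
    # recombine into the four quadrants of C = A @ B
    C11 = M1 + M4 - M5 + M7
    C12 = M3 + M5
    C21 = M2 + M4
    C22 = M1 - M2 + M3 + M6
    return np.block([[C11, C12], [C21, C22]])
```

The paper's stability result applies to this and to any recursive scheme of the same flavor: the error bound degrades only by a polynomial factor relative to classical multiplication.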

Cited by 169 publications (200 citation statements) · References 46 publications
“…Indeed, "block" algorithms relying on matrix multiplication are used in practice for many linear algebra operations [1,3], and have been shown to be stable assuming only the error bound (4) [12]. In a companion paper [11], we show that while stable these earlier block algorithms are not asymptotically as fast as matrix multiplication. However, [11] also shows there are variants of these block algorithms for operations like QR decomposition, linear equation solving and determinant computation that are both stable and as fast as matrix multiplication.…”
Section: Stability of Linear Algebra Algorithms Based on Matrix Multiplication
confidence: 91%
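The "block" algorithms this excerpt refers to cast most of the arithmetic as matrix multiplication. A minimal illustration of the pattern (my sketch, not the authors' code; it assumes nonsingular leading principal blocks and does no pivoting, which a production factorization would need) is a recursive block LU factorization, where the dominant cost is the Schur-complement update, a single matrix multiply:

```python
import numpy as np

def block_lu(A, block=2):
    """Recursive block LU factorization (no pivoting). The dominant
    cost is the Schur-complement matrix multiply, so its speed is
    governed by the matrix-multiplication algorithm underneath."""
    A = np.array(A, dtype=float)
    n = A.shape[0]
    if n <= block:
        # base case: unblocked Doolittle LU
        L, U = np.eye(n), A.copy()
        for k in range(n):
            for i in range(k + 1, n):
                L[i, k] = U[i, k] / U[k, k]
                U[i, k:] -= L[i, k] * U[k, k:]
        return L, U
    m = n // 2
    A11, A12 = A[:m, :m], A[:m, m:]
    A21, A22 = A[m:, :m], A[m:, m:]
    L11, U11 = block_lu(A11, block)
    U12 = np.linalg.solve(L11, A12)        # L11 U12 = A12
    L21 = np.linalg.solve(U11.T, A21.T).T  # L21 U11 = A21
    S = A22 - L21 @ U12                    # Schur complement: one matmul
    L22, U22 = block_lu(S, block)
    L = np.block([[L11, np.zeros((m, n - m))], [L21, L22]])
    U = np.block([[U11, U12], [np.zeros((n - m, m)), U22]])
    return L, U
```

Swapping the `@` in the Schur-complement line for a fast multiplication routine is exactly the move whose stability the companion papers analyze.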
“…Indeed, "block" algorithms relying on matrix multiplication are used in practice for many linear algebra operations [1,3], and have been shown to be stable assuming only the error bound (4) [12]. In a companion paper [11], we show that while stable these earlier block algorithms are not asymptotically as fast as matrix multiplication. However, [11] also shows there are variants of these block algorithms for operations like QR decomposition, linear equation solving and determinant computation that are both stable and as fast as matrix multiplication.…”
Section: Stability Of Linear Algebra Algorithms Based On Matrix Multimentioning
confidence: 91%
“…For the symmetric eigenproblem and SVD, there are such algorithms that begin by reduction to a condensed form. But for the nonsymmetric eigenproblem, the only known algorithm attaining the expected lower bound does not initially reduce to condensed form, and is not based on QR iteration [DDH07,BDD11].…”
Section: Eigenvalue and Singular Value Problems
confidence: 99%
“…Thus, using Cholesky-QR we plan to formulate many other numerical linear algebra operations with minimal communication. As an alternative, we are also looking into adjusting the algorithms for computing QR, eigenvalue decompositions, and the SVD which use Strassen's algorithm [8] to using our 2.5D matrix multiplication algorithm instead. Further, we plan to look for the most efficient and stable 2.5D QR factorization algorithms.…”
Section: Future Work
confidence: 99%
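The Cholesky-QR factorization mentioned in the last excerpt is attractive for communication avoidance because its dominant cost is one tall-skinny matrix multiply. The few-line sketch below is illustrative, not the cited paper's implementation; it assumes A has full column rank and is not too ill-conditioned, since forming the Gram matrix squares the condition number:

```python
import numpy as np

def cholesky_qr(A):
    """CholeskyQR: obtain R from the Cholesky factor of the Gram
    matrix A^T A, then Q = A R^{-1} via a triangular solve. Cheap on
    communication, but orthogonality degrades as cond(A)^2 grows."""
    G = A.T @ A                       # Gram matrix: one matmul
    R = np.linalg.cholesky(G).T      # upper-triangular factor, G = R^T R
    Q = np.linalg.solve(R.T, A.T).T  # Q = A R^{-1}
    return Q, R
```

In practice the loss of orthogonality is often repaired by a second pass (CholeskyQR2), which keeps the matmul-dominated cost profile that makes the method a natural fit for 2.5D-style algorithms.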