2014
DOI: 10.1137/120895755

Superfast and Stable Structured Solvers for Toeplitz Least Squares via Randomized Sampling

Abstract: We present some superfast (O((m + n) log²(m + n)) complexity) and stable structured direct solvers for m × n Toeplitz least squares problems. Based on the displacement equation, a Toeplitz matrix T is first transformed into a Cauchy-like matrix C, which can be shown to have small off-diagonal numerical ranks when the diagonal blocks are rectangular. We generalize standard hierarchically semiseparable (HSS) matrix representations to rectangular ones, and construct a rectangular HSS approximation to C in nearl…
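The displacement-equation step described in the abstract can be illustrated numerically. The sketch below (matrix size, random generator, and the particular DFT/scaling normalizations are illustrative choices, not necessarily the paper's exact conventions) verifies that a Toeplitz matrix has displacement rank at most 2, and that conjugating with DFT matrices yields a Cauchy-like matrix C with the same low displacement rank:

```python
import numpy as np

n = 16
rng = np.random.default_rng(0)
t = rng.standard_normal(2 * n - 1)
# Toeplitz matrix: T[i, j] = t[i - j] (index shifted to stay non-negative)
T = np.array([[t[i - j + n - 1] for j in range(n)] for i in range(n)])

def Z(phi):
    """phi-circulant down-shift: ones on the subdiagonal, phi in the corner."""
    M = np.diag(np.ones(n - 1, dtype=complex), -1)
    M[0, -1] = phi
    return M

# Displacement equation: Z_1 T - T Z_{-1} is nonzero only in the first row
# and last column, hence has rank at most 2 for ANY Toeplitz T.
R = Z(1) @ T - T @ Z(-1)
disp_rank = np.linalg.matrix_rank(R, tol=1e-8)   # <= 2

# The DFT diagonalizes Z_1 (Z_1 = Omi @ D1 @ Om), and a diagonal scaling
# relates Z_{-1} to Z_1, so C = Om @ T @ E^{-1} @ Omi inherits a rank-<=2
# displacement against two diagonal matrices: C is Cauchy-like.
Om = np.fft.fft(np.eye(n), axis=0)    # DFT matrix
Omi = np.fft.ifft(np.eye(n), axis=0)  # inverse DFT matrix
theta = np.exp(1j * np.pi / n)        # an n-th root of -1
C = Om @ T @ np.diag(theta ** -np.arange(n)) @ Omi

D1 = np.diag(np.exp(-2j * np.pi * np.arange(n) / n))
R2 = D1 @ C - theta * (C @ D1)
cauchy_disp_rank = np.linalg.matrix_rank(R2, tol=1e-8)  # <= 2
```

It is the low off-diagonal numerical rank of C (not present in T itself) that the rectangular HSS approximation then exploits.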

Cited by 64 publications (82 citation statements)
References 34 publications
“…Note that for arbitrary k-space trajectories, the Gram matrix, or the coefficient matrix of (17), has a block Toeplitz structure [25]. A number of off-the-shelf numerical solvers that take advantage of this structure enable very efficient solution of (17), including preconditioned CG solvers (e.g., [26], [27]) and hierarchically semiseparable (HSS) solvers (e.g., [28], [29]). Since (5) is a non-convex optimization problem, the solution of the algorithm generally depends on the initialization.…”
Section: Algorithm
confidence: 99%
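The preconditioned-CG route mentioned in the citation above can be sketched for an ordinary (non-block) symmetric Toeplitz system. Everything here is illustrative: the decaying kernel, the Strang circulant preconditioner, and the tolerances are assumptions for the demo, not taken from the cited works.

```python
import numpy as np
from scipy.linalg import toeplitz
from scipy.sparse.linalg import LinearOperator, cg

n = 256
# Illustrative SPD Toeplitz matrix from a decaying symmetric kernel
col = 1.0 / (1.0 + np.arange(n)) ** 2
col[0] = 2.0                       # make the matrix diagonally dominant
T = toeplitz(col)

# O(n log n) Toeplitz matvec: embed T in a 2n x 2n circulant
emb = np.concatenate([col, [0.0], col[:0:-1]])
f_emb = np.fft.fft(emb)
def matvec(x):
    return np.fft.ifft(f_emb * np.fft.fft(x, 2 * n))[:n].real

# Strang circulant preconditioner: copy the central diagonals of T
c = col.copy()
c[n // 2 + 1:] = col[n // 2 - 1:0:-1]
lam = np.fft.fft(c).real           # eigenvalues of the circulant (real > 0 here)
def psolve(x):
    return np.fft.ifft(np.fft.fft(x) / lam).real

A = LinearOperator((n, n), matvec=matvec, dtype=np.float64)
M = LinearOperator((n, n), matvec=psolve, dtype=np.float64)
b = np.ones(n)
x, info = cg(A, b, M=M)            # info == 0 on convergence
residual = np.linalg.norm(T @ x - b) / np.linalg.norm(b)
```

Both the matvec and the preconditioner solve cost O(n log n) per iteration, which is what makes this competitive with direct structured solvers for well-conditioned problems.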
“…Solving each subproblem only involves inverting a diagonal matrix, which can be done in O(N) operations. Regarding the linear least-squares problem in (10), it is decoupled with respect to each time frame and each coil, and each subproblem can be solved by the HSS solver in O(N log² N) operations [28].…”
Section: )
confidence: 99%
“…Despite the large number of references available on fast algorithms for computations with low-rank structured matrices, there exist very few references on the corresponding a priori rounding error analyses [1,2,4,14,18,19]. Some reasons for this might be that such fast algorithms are frequently complicated and that some of them are potentially unstable, although they work well in practice most of the time.…”
Section: Introduction
confidence: 99%
“…Once an HSS approximation to A is constructed, the well-established fast HSS factorization and solution algorithms in [7,34,37] can be applied.…”
Section: Randomized HSS Construction: Adaptive and Matrix-free Schemes
confidence: 99%
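The randomized, matrix-free flavor of HSS construction referenced above rests on the randomized range finder: a low-rank (off-diagonal) block is accessed only through products with random test vectors. A minimal sketch, with illustrative sizes and an exactly rank-r test matrix standing in for an HSS off-diagonal block:

```python
import numpy as np

rng = np.random.default_rng(1)
m, n, r = 200, 150, 8
# Stand-in for a low-rank off-diagonal block: exact rank r
B = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))

k = r + 10                           # sample size with oversampling
Y = B @ rng.standard_normal((n, k))  # touch B only via matrix-vector products
Q, _ = np.linalg.qr(Y)               # orthonormal basis capturing range(B)

# Relative error of the rank-k projection Q Q^T B (tiny when k >= rank)
err = np.linalg.norm(B - Q @ (Q.T @ B)) / np.linalg.norm(B)
```

Because only products with B are needed, the same scheme works when B is available solely as a fast black-box matvec, which is what makes the adaptive, matrix-free HSS constructions possible.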
“…[34] that such types of HSS factorizations usually have much better stability than standard dense factorizations (for the same matrix) due to the hierarchical structure and the use of orthogonal off-diagonal operations.…”
Section: Or (48)
confidence: 99%