2004
DOI: 10.1007/s00607-004-0080-4

Hierarchical Matrices Based on a Weak Admissibility Criterion

Cited by 81 publications (88 citation statements)
References 14 publications
“…§2.1) defined on a rectangle with the only singularity at the origin. The identical problem appeared in the study of weakly admissible clusters in the theory of H-matrices (see [22]). A separable approximation of F with r terms will lead to a Kronecker tensor-product approximation of the matrix with Kronecker rank r. The interior structure of the Kronecker factors will however depend on the method by which they are constructed.…”
Section: Introduction
confidence: 93%
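The quoted passage links a separable (rank-r) approximation of the generating function F to a Kronecker tensor-product approximation of the matrix with Kronecker rank r. The NumPy sketch below, which is not taken from the cited works, shows one standard way to obtain such Kronecker factors, namely Van Loan's rearrangement followed by a truncated SVD; the kernel 1/(x+y) (singular only at the origin), the grid, and the rank are illustrative assumptions.

```python
import numpy as np

def kronecker_rank_r(A, n, r):
    """Kronecker rank-r approximation A ≈ sum_k U_k ⊗ V_k of an n^2 x n^2
    matrix via Van Loan's rearrangement: permuting the entries of A turns
    the problem into an ordinary low-rank approximation, solved by a
    truncated SVD (best in the Frobenius norm)."""
    # Rearrange so that row index (i1, j1) addresses the U-factor entry and
    # column index (i2, j2) addresses the V-factor entry.
    R = (A.reshape(n, n, n, n)      # A[(i1, i2), (j1, j2)]
           .transpose(0, 2, 1, 3)   # -> [i1, j1, i2, j2]
           .reshape(n * n, n * n))
    W, s, Vt = np.linalg.svd(R, full_matrices=False)
    Us = [(np.sqrt(s[k]) * W[:, k]).reshape(n, n) for k in range(r)]
    Vs = [(np.sqrt(s[k]) * Vt[k, :]).reshape(n, n) for k in range(r)]
    return Us, Vs

# Illustrative kernel with its only singularity at the origin.
n = 16
x = np.linspace(0.1, 1.0, n * n)
A = 1.0 / (x[:, None] + x[None, :])
Us, Vs = kronecker_rank_r(A, n, r=4)
A_r = sum(np.kron(U, V) for U, V in zip(Us, Vs))
print(np.linalg.norm(A - A_r) / np.linalg.norm(A))   # small relative error
```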
“…• The matrix-by-vector complexity with the Kronecker-tensor ansatz (1.1) is outperformed asymptotically by the almost linear estimates which are typical for H-matrices (see [17,18,19,22,23]) or the mosaic-skeleton method (cf. [31,32,33]) as well as the earlier well-known methods such as panel clustering [24], multipole [28,30], and interpolation using regular or hierarchical grids (cf.…
Section: Introduction
confidence: 99%
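For context on the complexity comparison in the quote, the sketch below (an illustration, not code from the cited papers) applies a Kronecker rank-r matrix to a vector without assembling it; each term costs O(n³) = O(N^(3/2)) for an N × N matrix with N = n², which is the kind of estimate the quoted passage contrasts with the almost linear H-matrix bounds. The random factors and sizes are assumptions for the demonstration.

```python
import numpy as np

def kron_matvec(Us, Vs, x):
    """Apply (sum_k U_k ⊗ V_k) to a vector without forming the N x N matrix.
    With row-major reshaping, (U ⊗ V) vec(X) = vec(U X V^T), so each term
    costs O(n^3) = O(N^(3/2)) for N = n^2, instead of O(N^2)."""
    n = Us[0].shape[0]
    X = x.reshape(n, n)
    Y = sum(U @ X @ V.T for U, V in zip(Us, Vs))
    return Y.reshape(n * n)

# Check against the explicitly assembled Kronecker sum (small sizes only).
rng = np.random.default_rng(0)
n, r = 32, 3
Us = [rng.standard_normal((n, n)) for _ in range(r)]
Vs = [rng.standard_normal((n, n)) for _ in range(r)]
x = rng.standard_normal(n * n)
A = sum(np.kron(U, V) for U, V in zip(Us, Vs))
print(np.allclose(A @ x, kron_matvec(Us, Vs, x)))   # expected: True
```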
“…However, in numerical tests better results in terms of computational time have been obtained using larger η and correspondingly having larger but fewer admissible subblocks. In [40] better results have been obtained as well when even subblocks s × t with s ≠ t were accepted as admissible. In numerical tests η := 50 has been a good choice and this value has been used in the rest of the work.…”
Section: Task (T1): Construction of the H-matrices
confidence: 99%
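The parameter η in the quote enters the standard admissibility condition min(diam(s), diam(t)) ≤ η · dist(s, t). Below is a minimal sketch of this test for clusters represented as 1D bounding intervals; the interval representation and the sample values are assumptions, and only the criterion itself comes from the standard H-matrix literature.

```python
def admissible(s, t, eta):
    """Standard admissibility test for two clusters given as 1D bounding
    intervals s = (a, b), t = (c, d):
        min(diam(s), diam(t)) <= eta * dist(s, t).
    A larger eta accepts larger (hence fewer) subblocks as admissible,
    which is the trade-off discussed in the quoted passage."""
    diam_s = s[1] - s[0]
    diam_t = t[1] - t[0]
    dist = max(0.0, max(s[0], t[0]) - min(s[1], t[1]))
    return min(diam_s, diam_t) <= eta * dist

# Two nearby clusters: rejected for eta = 1, accepted for eta = 50.
print(admissible((0.0, 1.0), (1.1, 2.1), eta=1.0))    # False
print(admissible((0.0, 1.0), (1.1, 2.1), eta=50.0))   # True
```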
“…The covariance matrix C is not sparse and, in general, requires O(n²) units of memory for the storage and O(n²) FLOPS for the matrix-vector multiplication. In this section it will be shown how to approximate general covariance matrices with the H-matrix format [17,18,20,22]. The H-matrix technique is nothing but a hierarchical division of a given matrix into subblocks and further approximation of the majority of them by low-rank matrices (Fig.…”
Section: H-matrix Technique
confidence: 99%
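A toy sketch of the idea described in the quote, a recursive division into subblocks with low-rank (truncated SVD) approximation of the off-diagonal ones, in the spirit of the weak admissibility criterion of the cited paper. The block partition, leaf size, rank, and the exponential covariance kernel are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def compress_block(B, rank):
    """Replace a subblock by two thin factors from a truncated SVD."""
    U, s, Vt = np.linalg.svd(B, full_matrices=False)
    k = min(rank, len(s))
    return U[:, :k] * s[:k], Vt[:k, :]

def build_h(A, rank, leaf=64):
    """Toy hierarchical approximation: split 2 x 2, recurse on the diagonal
    blocks, and store each off-diagonal block in low-rank form. Small
    diagonal blocks are kept dense."""
    n = A.shape[0]
    if n <= leaf:
        return A.copy()
    h = n // 2
    return {
        "A11": build_h(A[:h, :h], rank, leaf),
        "A22": build_h(A[h:, h:], rank, leaf),
        "A12": compress_block(A[:h, h:], rank),   # factors (U, V)
        "A21": compress_block(A[h:, :h], rank),
    }

# Illustrative covariance kernel on a 1D grid.
pts = np.linspace(0.0, 1.0, 256)
C = np.exp(-np.abs(pts[:, None] - pts[None, :]))
H = build_h(C, rank=6)
```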