2023
DOI: 10.1109/tpds.2022.3223512
Level-Based Blocking for Sparse Matrices: Sparse Matrix-Power-Vector Multiplication

Abstract: The multiplication of a sparse matrix with a dense vector (SpMV) is a key component in many numerical schemes and its performance is known to be severely limited by main memory access. Several numerical schemes require the multiplication of a sparse matrix polynomial with a dense vector, which is typically implemented as a sequence of SpMVs. This results in low performance and ignores the potential to increase the arithmetic intensity by reusing the matrix data from cache. In this work we use the recursive alge…
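The baseline the abstract criticizes can be sketched in a few lines: evaluating a sparse matrix polynomial p(A)x as a sequence of plain SpMVs, where every term streams the entire matrix from main memory. This is a minimal illustrative sketch (function name and interface are mine, not the paper's); the paper's contribution is a blocked kernel that avoids exactly this repeated streaming.

```python
import numpy as np
import scipy.sparse as sp

def poly_spmv(A, x, coeffs):
    """Evaluate p(A) x = sum_k coeffs[k] * A^k @ x as repeated SpMVs.

    Each iteration performs one full SpMV sweep over A, so the matrix
    is read from main memory len(coeffs) - 1 times -- the low-intensity
    baseline that level-based blocking is designed to improve on.
    """
    y = coeffs[0] * x
    v = x
    for c in coeffs[1:]:
        v = A @ v          # one full sweep over the sparse matrix
        y += c * v
    return y
```

For example, with `A = diag(1, 2, 3)` and `coeffs = [1, 2]`, the result is `(I + 2A) x`.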

Cited by 8 publications (2 citation statements)
References 39 publications
“…For a sparse graph G, the maximum mean degree is denoted as mad(G) and satisfies the condition that . In this article, there are other less common symbols that are consistent with the notation in Bondy and Murty's book [6][7][8][9][10].…”
mentioning
confidence: 83%
“…Determining the exact number of computations performed by these methods is a complex task, further complicated by the fact that sparse matrix-vector multiplications are known to be bandwidth-limited in terms of performance (Alappat et al. [2022], Huber et al. [2020]). Therefore, we adopt a simplified model that focuses exclusively on counting the loading and storing of elements.…”
Section: Computational Cost and Memory Usage
mentioning
confidence: 99%
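The simplified model this citing paper describes — counting only loads and stores because SpMV is bandwidth-bound — can be sketched as a back-of-the-envelope traffic estimate for one CRS-format SpMV. The function name, byte sizes, and the pessimistic no-cache-reuse assumption for the input vector are illustrative assumptions of mine, not taken from either paper.

```python
def spmv_traffic_bytes(nnz, nrows, val_bytes=8, idx_bytes=4):
    """Estimated main-memory traffic for one CRS SpMV (a common
    roofline-style model, stated here as an assumption):
      - load nnz matrix values and nnz column indices,
      - load nrows + 1 row pointers,
      - load one x entry per nonzero (pessimistic: no cache reuse),
      - load and store each of the nrows entries of y.
    """
    matrix_bytes = nnz * (val_bytes + idx_bytes) + (nrows + 1) * idx_bytes
    vector_bytes = nnz * val_bytes + 2 * nrows * val_bytes
    return matrix_bytes + vector_bytes
```

Under this model a k-term polynomial evaluated as k separate SpMVs costs roughly k times this figure, which is precisely the redundancy that cache-blocking schemes target.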