2003
DOI: 10.1007/978-3-540-39737-3_115
Minimizing Communication Cost in Fine-Grain Partitioning of Sparse Matrices

Abstract: We present a two-phase approach for minimizing various communication-cost metrics in fine-grain partitioning of sparse matrices for parallel processing. In the first phase, we obtain a partitioning of the matrix with existing tools to determine the computational loads of the processors. In the second phase, we try to minimize the communication-cost metrics; for this purpose, we develop communication-hypergraph and partitioning models. We experimentally evaluate the contributions on a PC cluster.
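To make the fine-grain setting concrete, here is a minimal sketch (not the paper's actual tool chain; all names are illustrative) of how the total communication volume of a fine-grain partition can be counted: each nonzero a_ij, each vector entry x_j, and each y_i is assigned to a processor, and a word is communicated whenever a nonzero's owner differs from the owner of the x entry it reads or the y entry it updates.

```python
# Hypothetical sketch: count the total communication volume of a
# fine-grain partition for y = A x. Each nonzero a_ij, each x_j and
# each y_i has an owning processor.

def total_volume(nonzeros, part_nz, part_x, part_y):
    """nonzeros: list of (i, j) coordinates of the nonzeros of A.
    part_nz[(i, j)]: processor owning nonzero a_ij.
    part_x[j]: processor owning vector entry x_j.
    part_y[i]: processor owning vector entry y_i.
    Returns the total number of words communicated in y = A x.
    """
    expand = set()   # (processor, j): processor must receive a copy of x_j
    fold = set()     # (processor, i): processor must send a partial y_i
    for (i, j) in nonzeros:
        p = part_nz[(i, j)]
        if p != part_x[j]:
            expand.add((p, j))
        if p != part_y[i]:
            fold.add((p, i))
    return len(expand) + len(fold)

# Tiny example: a 2x2 matrix with 3 nonzeros on 2 processors.
# Only a_01 (owned by processor 1) updates y_0 (owned by processor 0),
# so one partial result is communicated: volume 1.
nnz = [(0, 0), (0, 1), (1, 1)]
vol = total_volume(nnz,
                   part_nz={(0, 0): 0, (0, 1): 1, (1, 1): 1},
                   part_x={0: 0, 1: 1},
                   part_y={0: 0, 1: 1})
```

This counts distinct (processor, entry) pairs rather than raw nonzeros, since a processor needing x_j for several of its nonzeros still receives it only once.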

Cited by 9 publications (14 citation statements)
References 14 publications
“…Some works on 2D models do not take the communication volume into account; however, they provide an upper bound on the number of messages communicated [40][41][42][43][44]. On the other hand, there are 2D models that aim at reducing volume, with or without providing a bound on the maximum number of messages [33][45][46][47][48][49][50]. 2D partitioning models in the literature can further be categorized into three classes: checkerboard partitioning [47,49,50] (also known as coarse-grain partitioning), jagged partitioning [45,49], and fine-grain partitioning [46,48,49].…”
Section: Related Work
confidence: 99%
“…These models aim at reducing latency costs, usually at the expense of increasing bandwidth costs. Similar models have been investigated [48,53], but they are based on 1D and 2D fine-grain models. (ii) All proposed and investigated partitioning models are realized within two iterative methods, CGNE and CGNR, implemented with the widely adopted PETSc toolkit [59].…”
Section: Motivation and Contributions
confidence: 99%
“…Finding a partition on the vectors x and y is referred to as the vector partitioning operation, and it can be performed in three different ways: by decoding the partition given on A [2]; in a post-processing step using the partition on the matrix [7,8,13]; or by explicitly partitioning the vectors while partitioning the matrix [9]. In any of these cases, the vector partitioning for matrix-vector operations is called symmetric if x and y have the same partition, and non-symmetric otherwise.…”
Section: Parallel Matrix-Vector Multiply Algorithms
confidence: 99%
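A symmetric vector partitioning can be illustrated with a small sketch (names and setup are illustrative, assuming a square matrix partitioned row-wise): the processor owning row i also owns x_i and y_i, so the partition on A directly determines which x entries each processor must receive from others.

```python
# Hypothetical sketch: under a symmetric, row-wise partition of a square
# matrix, determine which x entries processor `me` must receive before
# computing its rows of y = A x.

def required_x_entries(rows, row_owner, me):
    """rows: dict mapping row index i -> column indices j with a_ij != 0.
    row_owner[i]: processor owning row i (and hence x_i and y_i,
                  since the vector partition mirrors the row partition).
    Returns the set of x entries processor `me` must receive.
    """
    needed = set()
    for i, cols in rows.items():
        if row_owner[i] != me:
            continue                 # not our row; nothing to compute
        for j in cols:
            if row_owner[j] != me:   # x_j lives on another processor
                needed.add(j)
    return needed

# Tiny example: rows 0 and 1 on processor 0, row 2 on processor 1.
# Processor 0's row 1 reads x_2, which processor 1 owns.
rows = {0: [0, 1], 1: [1, 2], 2: [2]}
needed = required_x_entries(rows, row_owner={0: 0, 1: 0, 2: 1}, me=0)
```

With a non-symmetric partition, x and y would carry separate owner maps and this decoding step would no longer follow directly from the row partition.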
“…During the last decade, several successful hypergraph-based models and methods have been proposed for sparse matrix partitioning [1][2][3][4][5][6][7][8][9][10]. These models and methods have gained wide acceptance in the literature for the efficient parallelization of sparse matrix-vector multiply operations.…”
Section: Introduction
confidence: 99%