Proceedings of the 20th European MPI Users' Group Meeting 2013
DOI: 10.1145/2488551.2488552
MPI datatype processing using runtime compilation

Abstract: Data packing before and after communication can make up as much as 90% of the communication time on modern computers. Despite MPI's well-defined datatype interface for non-contiguous data access, many codes use manual pack loops for performance reasons. Programmers write access-pattern-specific pack loops (e.g., with manual unrolling) for which compilers emit optimized code. In contrast, MPI implementations in use today interpret datatypes at pack time, resulting in high overheads. In this work we explore the effectiveness of runtime compilation for MPI datatype processing.
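To make the contrast in the abstract concrete, here is a minimal sketch (in C; not code from the paper, all names are illustrative) of describing a non-contiguous matrix column through MPI's derived-datatype interface instead of packing it by hand:

#include <mpi.h>

/* Sketch: send one column of an nrows x ncols row-major double matrix
 * without a manual pack loop, by describing the stride with a derived
 * datatype. All names (nrows, ncols, col, dest) are illustrative. */
void send_column(const double *matrix, int nrows, int ncols,
                 int col, int dest, MPI_Comm comm)
{
    MPI_Datatype column;

    /* nrows blocks of 1 double each, separated by ncols doubles (one row) */
    MPI_Type_vector(nrows, 1, ncols, MPI_DOUBLE, &column);
    MPI_Type_commit(&column);

    /* the MPI library performs the non-contiguous access itself */
    MPI_Send(matrix + col, 1, column, dest, 0, comm);

    MPI_Type_free(&column);
}

An interpreted datatype engine walks such a description element by element at pack time, whereas a hand-written loop for the same fixed pattern is specialized and optimized at compile time.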

Cited by 22 publications (18 citation statements) · References 17 publications
“…Although better results have been reported by, for instance, Hoefler and Gottlieb (2010) and Lu et al (2004), this inconsistency seems to hamper the widespread use of derived datatypes. This, despite a considerable amount of work by Byna et al (2006), Ross et al (2003), Schneider et al (2013) and Wu et al (2004) among others addressing the issue. Schneider et al (2013) also demonstrate runtime compilation for MPI datatypes.…”
Section: Background and Related Work
confidence: 97%
“…Although better results have been reported [3,7], this inconsistency seems to hamper the widespread use of derived datatypes. This, despite a considerable amount of work [2,12,13,15] addressing the issue. In [13], Schneider et al also demonstrate runtime compilation for MPI datatypes.…”
Section: MPI Derived Datatypes
confidence: 99%
“…This, despite a considerable amount of work [2,12,13,15] addressing the issue. In [13], Schneider et al also demonstrate runtime compilation for MPI datatypes. Our approach is, we believe, more comprehensive than theirs.…”
Section: MPI Derived Datatypes
confidence: 99%
“…MPI, for example, allows the specification of datatypes that simplify and optimize non-contiguous communications. We have shown in a previous study that runtime compilation techniques can speed up the packing of MPI DDTs by a factor of seven [19], and therefore make it competitive with manual packing. The proposed techniques in this work automatically overlap packing and communication to enable further optimization.…”
Section: Omitted Southeast and West Unpack Loops
confidence: 99%
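The overlap of packing and communication referred to in the statement above can be illustrated, in spirit, by a generic double-buffered pipeline that packs one block into a staging buffer while the previously packed block is still in flight. This is only a sketch of the general idea, not the scheme proposed in the citing work; the blocking scheme and all names are illustrative.

#include <mpi.h>
#include <stdlib.h>
#include <string.h>

/* Generic sketch of overlapping packing with communication: pack block
 * i into one of two staging buffers while the previously packed block
 * is still being sent. */
void pipelined_send(const double *src, int nblocks, int block_elems,
                    int stride_elems, int dest, MPI_Comm comm)
{
    double *buf[2];
    MPI_Request req[2] = { MPI_REQUEST_NULL, MPI_REQUEST_NULL };

    buf[0] = malloc(block_elems * sizeof(double));
    buf[1] = malloc(block_elems * sizeof(double));

    for (int i = 0; i < nblocks; i++) {
        int slot = i & 1;
        /* reuse of this staging buffer must wait for its previous send */
        MPI_Wait(&req[slot], MPI_STATUS_IGNORE);
        /* pack: copy one contiguous block of the strided layout */
        memcpy(buf[slot], src + (size_t)i * stride_elems,
               block_elems * sizeof(double));
        /* start the send; the next iteration packs while this is in flight */
        MPI_Isend(buf[slot], block_elems, MPI_DOUBLE, dest, i, comm,
                  &req[slot]);
    }
    MPI_Waitall(2, req, MPI_STATUSES_IGNORE);
    free(buf[0]);
    free(buf[1]);
}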
“…Schneider et al [19] also optimized the packing of MPI derived datatypes. MPI DDTs are traditionally interpreted at runtime, which is often slower than manual pack loops written for a specific case and optimized by the compiler at compile time.…”
Section: Related Work
confidence: 99%
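For reference, the kind of access-pattern-specific manual pack loop this comparison refers to might look as follows. Because the stride and extent are compile-time constants, the compiler can unroll and vectorize it; interpreted datatype engines give up exactly this specialization, and runtime-compiled pack code aims to recover it. The constants and names below are illustrative, not taken from the cited works.

#include <stddef.h>

/* Sketch of an access-pattern-specific manual pack loop with simple
 * 4-way unrolling for one fixed stride. */
#define NROWS  1024
#define STRIDE 1024   /* row length of the matrix, in doubles */

void pack_column(const double *restrict matrix, int col,
                 double *restrict buf)
{
    int i = 0;
    for (; i + 4 <= NROWS; i += 4) {           /* unrolled by hand */
        buf[i]     = matrix[(size_t)(i)     * STRIDE + col];
        buf[i + 1] = matrix[(size_t)(i + 1) * STRIDE + col];
        buf[i + 2] = matrix[(size_t)(i + 2) * STRIDE + col];
        buf[i + 3] = matrix[(size_t)(i + 3) * STRIDE + col];
    }
    for (; i < NROWS; i++)                     /* remainder */
        buf[i] = matrix[(size_t)i * STRIDE + col];
}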