2016
DOI: 10.1007/978-3-319-46079-6_5
The EPiGRAM Project: Preparing Parallel Programming Models for Exascale

Cited by 10 publications (8 citation statements) | References 31 publications
“…Chapel [89] is a PGAS-based programming language by Cray Inc. UPC++ [16] is a library based on asynchronous PGAS, adding the ability to spawn and coordinate tasks. The EPiGRAM project [121] aims to improve the interoperability of MPI and PGAS.…”
Section: MPI and OpenMP According to Gropp and Snir (mentioning)
confidence: 99%
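To make the asynchronous-PGAS idea in the statement above concrete, here is a minimal sketch using the Berkeley UPC++ library: one rank places a value in its shared segment, and every rank spawns an asynchronous remote read whose completion is coordinated through a future. This is an illustrative sketch, not code from the cited papers; the variable names and the single-integer payload are arbitrary choices.

```cpp
// Minimal asynchronous-PGAS sketch with UPC++ (illustrative only).
#include <upcxx/upcxx.hpp>
#include <iostream>

int main() {
  upcxx::init();

  // Rank 0 places an integer in its shared segment; all ranks learn its global address.
  upcxx::global_ptr<int> counter =
      (upcxx::rank_me() == 0) ? upcxx::new_<int>(42) : nullptr;
  counter = upcxx::broadcast(counter, 0).wait();

  // Each rank spawns an asynchronous remote read and chains a continuation onto
  // the returned future -- the "spawn and coordinate tasks" idea in library form.
  upcxx::future<> done = upcxx::rget(counter).then([](int value) {
    std::cout << "rank " << upcxx::rank_me() << " read " << value << '\n';
  });
  done.wait();

  upcxx::barrier();                       // make sure all reads finished before freeing
  if (upcxx::rank_me() == 0) upcxx::delete_(counter);
  upcxx::finalize();
  return 0;
}
```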
“…Exascale systems force new requirements on programming systems to target platforms with hundreds of homogeneous and heterogeneous cores. Evolutionary models have been recently proposed for Exascale programming that extend or adapt traditional parallel programming models like MPI (e.g., EPiGRAM [15] that uses a library-based approach, Open MPI for Exascale in the ECP initiative), OpenMP (e.g., OmpSs [8] that exploits an annotation-based approach, the SOLLVE project), and MapReduce (e.g., Pig Latin [22] that implements a domain-specific complete language). These new frameworks limit the communication overhead in message passing paradigms or limit the synchronization control if a shared-memory model is used [11].…”
Section: Exascale Programming Systems (mentioning)
confidence: 99%
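The claim that these frameworks limit communication overhead in message-passing paradigms typically rests on overlapping communication with computation. The sketch below is a generic, hedged illustration using plain non-blocking MPI, not code from EPiGRAM, Open MPI for Exascale, or OmpSs; the periodic neighbours, the 8-element field, and the trivial in-place "computation" are placeholder choices.

```cpp
// Hedged sketch: overlapping a halo exchange with interior work via non-blocking MPI.
#include <mpi.h>
#include <vector>

int main(int argc, char** argv) {
  MPI_Init(&argc, &argv);
  int rank, size;
  MPI_Comm_rank(MPI_COMM_WORLD, &rank);
  MPI_Comm_size(MPI_COMM_WORLD, &size);

  int left  = (rank - 1 + size) % size;   // periodic neighbours (illustrative)
  int right = (rank + 1) % size;

  std::vector<double> field(8, static_cast<double>(rank));
  double halo_left = 0.0, halo_right = 0.0;
  MPI_Request reqs[4];

  // Post the halo exchange first ...
  MPI_Irecv(&halo_left,  1, MPI_DOUBLE, left,  0, MPI_COMM_WORLD, &reqs[0]);
  MPI_Irecv(&halo_right, 1, MPI_DOUBLE, right, 1, MPI_COMM_WORLD, &reqs[1]);
  MPI_Isend(&field.back(),  1, MPI_DOUBLE, right, 0, MPI_COMM_WORLD, &reqs[2]);
  MPI_Isend(&field.front(), 1, MPI_DOUBLE, left,  1, MPI_COMM_WORLD, &reqs[3]);

  // ... then do interior work that does not depend on the halos while messages are in flight.
  for (std::size_t i = 1; i + 1 < field.size(); ++i) field[i] *= 2.0;

  MPI_Waitall(4, reqs, MPI_STATUSES_IGNORE);

  // Boundary work that needs the received halo values.
  field.front() += halo_left;
  field.back()  += halo_right;

  MPI_Finalize();
  return 0;
}
```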
“…Porting an MPI-based application to GPI-2 can be done incrementally, by identifying independent MPI communication sections and replacing them gradually with GPI-2 communication sections, taking advantage of the ability of GPI-2 to work in mixed-mode model with MPI (Markidis et al, 2016). An important rule to follow here is to preserve the application’s logic.…”
Section: State-of-the-Art Interoperability of GASPI and MPI (mentioning)
confidence: 99%
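A minimal sketch of this incremental, mixed-mode porting pattern is given below. It assumes a GPI-2 installation built with MPI interoperability, so that MPI_Init runs before gaspi_proc_init and the untouched parts of the application keep using MPI; the segment size, queue, notification IDs, and neighbour pattern are illustrative choices, not values from the cited text.

```cpp
// Hedged sketch: one MPI communication phase replaced by GPI-2 (GASPI) one-sided
// writes, with the rest of the application still running on MPI (mixed mode).
#include <mpi.h>
#include <GASPI.h>

int main(int argc, char** argv) {
  MPI_Init(&argc, &argv);                       // existing MPI application start-up
  gaspi_proc_init(GASPI_BLOCK);                 // GPI-2 joins the already running MPI job

  gaspi_rank_t rank, nranks;
  gaspi_proc_rank(&rank);
  gaspi_proc_num(&nranks);

  // One shared segment per process; size and IDs here are illustrative.
  const gaspi_segment_id_t seg = 0;
  gaspi_segment_create(seg, 2 * sizeof(double), GASPI_GROUP_ALL,
                       GASPI_BLOCK, GASPI_MEM_INITIALIZED);
  gaspi_pointer_t seg_ptr;
  gaspi_segment_ptr(seg, &seg_ptr);
  double* buf = static_cast<double*>(seg_ptr);  // buf[0] = send slot, buf[1] = recv slot
  buf[0] = static_cast<double>(rank);

  // Former MPI send/receive pair, now a one-sided write plus notification.
  const gaspi_rank_t target = (rank + 1) % nranks;
  gaspi_write_notify(seg, 0 /*local offset*/, target,
                     seg, sizeof(double) /*remote offset*/, sizeof(double),
                     0 /*notification id*/, 1 /*notification value*/,
                     0 /*queue*/, GASPI_BLOCK);
  gaspi_wait(0, GASPI_BLOCK);                   // local completion of the queued write

  gaspi_notification_id_t got;
  gaspi_notification_t val;
  gaspi_notify_waitsome(seg, 0, 1, &got, GASPI_BLOCK);  // wait for the incoming write
  gaspi_notify_reset(seg, got, &val);

  MPI_Barrier(MPI_COMM_WORLD);                  // untouched sections keep using MPI

  gaspi_proc_term(GASPI_BLOCK);
  MPI_Finalize();
  return 0;
}
```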