2021
DOI: 10.1177/10943420211029299
EXAGRAPH: Graph and combinatorial methods for enabling exascale applications

Abstract: Combinatorial algorithms in general and graph algorithms in particular play a critical enabling role in numerous scientific applications. However, the irregular memory access nature of these algorithms makes them one of the hardest algorithmic kernels to implement on parallel systems. With tens of billions of hardware threads and deep memory hierarchies, the exascale computing systems in particular pose extreme challenges in scaling graph algorithms. The codesign center on combinatorial algorithms, ExaGraph, w…
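To make the abstract's point about irregular memory access concrete, the following is a minimal, hypothetical C++ sketch (not taken from the paper) of a neighbor-sum traversal over a graph stored in compressed sparse row (CSR) form. The toy graph and per-vertex values are assumptions for illustration; the data-dependent loads through the adjacency array are what make such kernels hard to cache and prefetch at scale.

// Minimal sketch (illustrative only): CSR graph traversal showing the
// irregular, data-dependent memory accesses typical of graph kernels.
#include <cstdio>
#include <vector>

int main() {
    // Small example graph in CSR form: offsets[v]..offsets[v+1] indexes
    // the neighbor list of vertex v inside 'adjacency'.
    std::vector<int> offsets   = {0, 2, 4, 6, 7};        // 4 vertices
    std::vector<int> adjacency = {1, 3, 0, 2, 1, 3, 0};  // 7 directed edges
    std::vector<double> value  = {1.0, 2.0, 3.0, 4.0};   // per-vertex data

    // Sum neighbor values for each vertex. The inner loads value[u] are
    // indexed through 'adjacency', so they scatter across memory rather
    // than streaming contiguously -- the source of poor cache and memory
    // bandwidth behavior on large graphs.
    for (int v = 0; v < 4; ++v) {
        double sum = 0.0;
        for (int e = offsets[v]; e < offsets[v + 1]; ++e) {
            int u = adjacency[e];   // irregular, data-dependent index
            sum += value[u];
        }
        std::printf("vertex %d: neighbor sum = %g\n", v, sum);
    }
    return 0;
}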

Cited by 12 publications (3 citation statements)
References 100 publications
“…As the HPC and cloud computing communities increasingly rely on hardware specialization to improve performance, codesign approaches will support the development of accelerators (Lie 2021; Reuther et al. 2021; Cortés et al. 2021) for frequently used kernels in scientific modeling and AI/ML methods. New specialized accelerators may arise to support additional data science capabilities such as uncertainty quantification, streaming analytics, or graph analysis (Halappanavar et al. 2021; Acer et al. 2021). These future large-scale computing systems with extreme heterogeneity must be codesigned to support the increased computational and dataset sizes associated with Earth science predictability and scientific machine reasoning (Yang et al. 2016; Zhang et al. 2020; Yu et al. 2022).…”
Section: Future System Concepts
confidence: 99%
“…The authors have shown a 20× speedup for the GraphSage benchmark (Hamilton et al., 2017) and a 36× speedup on GAT benchmarks (Velickovic et al., 2018). In a more general setting, scaling of unstructured graph algorithms is an important problem. Within the Department of Energy, the ExaGraph Co-design Center is a component of the Exascale Computing Project, which aims to develop broadly applicable graph kernels for exascale computation (1 exaflop = 10^18 flops) in applications spanning power grid, biology, chemistry, wind energy, and national security (Acer et al., 2021). The project aims to codesign algorithms, computing architecture, and hardware to build combinatorial kernels impacting general graph computation in addition to GNNs.…”
Section: Scalable GNNs
confidence: 99%
“…In a more general setting, scaling of unstructured graph algorithms is an important problem. Within the Department of Energy, the ExaGraph Co-design Center is a component of the Exascale Computing Project, which aims to develop broadly applicable graph kernels for exascale computation (1 exaflop = 10^18 flops) in applications spanning power grid, biology, chemistry, wind energy, and national security (Acer et al., 2021). The project aims to co-design algorithms, computing architecture, and hardware to build combinatorial kernels impacting general graph computation in addition to graph neural networks.…”
Section: Scalable GNNs
confidence: 99%