2021
DOI: 10.48550/arxiv.2103.10836
Preprint

GNNerator: A Hardware/Software Framework for Accelerating Graph Neural Networks

Abstract: Graph Neural Networks (GNNs) apply deep learning to inputs represented as graphs. They use fully-connected layers to extract features from the nodes/edges of a graph and aggregate these features using message passing between nodes, thereby combining two distinct computational patterns: dense, regular computations and sparse, irregular computations. To address the computational challenges posed by GNNs, we propose GNNerator, an accelerator with heterogeneous compute engines optimized for these two patterns…
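The two patterns the abstract contrasts can be made concrete with a minimal sketch: a dense fully-connected transform of node features, followed by sparse sum-aggregation over edges. The graph, dimensions, and weights below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Toy GNN layer combining the two patterns the abstract describes:
# a dense feature transform (regular compute) followed by
# sparse neighbor aggregation via message passing (irregular compute).
num_nodes, in_dim, out_dim = 4, 3, 2
rng = np.random.default_rng(0)
features = rng.standard_normal((num_nodes, in_dim))
weights = rng.standard_normal((in_dim, out_dim))

# Edge list (src -> dst) for a small directed graph.
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]

# Dense phase: fully-connected transform applied to every node.
transformed = features @ weights  # shape (num_nodes, out_dim)

# Sparse phase: each node sums the messages on its incoming edges.
aggregated = np.zeros((num_nodes, out_dim))
for src, dst in edges:
    aggregated[dst] += transformed[src]
```

The dense phase is a regular matrix multiply, while the sparse phase's memory accesses depend on graph structure; this is the workload split that motivates heterogeneous compute engines.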

Cited by 1 publication (1 citation statement)
References 12 publications
“…Narrow shard-based data reuse. To further reduce DRAM access, we explore data reuse through sharding strategies [21], [43], [66], [67]. In Figure 7, DIMM-0 computes the partial reduction of destination vertices v1-v8.…”
Section: Source (mentioning)
confidence: 99%
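The cited statement's shard-based partial reduction can be sketched as follows: destination vertices are partitioned into narrow shards, each compute unit (such as the DIMM in the cited design) accumulates only its shard's partial sums, and the partials are combined at the end. Shard size, graph, and features here are illustrative assumptions.

```python
import numpy as np

# Sketch of narrow shard-based data reuse: each unit reduces only the
# destination vertices in its own shard, so it touches a smaller working
# set and re-reads less data from DRAM.
num_vertices, feat_dim, shard_size = 16, 4, 8
rng = np.random.default_rng(1)
features = rng.standard_normal((num_vertices, feat_dim))
edges = [(i, (i + 1) % num_vertices) for i in range(num_vertices)]

# Partition destination vertices into shards of 8 (e.g. v1-v8, v9-v16).
shards = [range(s, s + shard_size) for s in range(0, num_vertices, shard_size)]

partials = []
for shard in shards:
    shard_set = set(shard)
    acc = np.zeros((num_vertices, feat_dim))
    for src, dst in edges:
        if dst in shard_set:  # this unit only reduces its own shard
            acc[dst] += features[src]
    partials.append(acc)

# Combining the per-shard partial reductions yields the full aggregation.
result = sum(partials)
```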