2020
DOI: 10.48550/arxiv.2007.13828
Preprint
GRIP: A Graph Neural Network Accelerator Architecture

Cited by 15 publications (18 citation statements). References 0 publications.
“…Most of these works accelerate the inference process or a certain phase in the training of GCN leveraging hardware characteristics and specially designed tools [74][75][76][77][78][79][80].…”
Section: Challenges and Future Directions
confidence: 99%
“…
Accelerator      | Algorithms supported
EnGN [16]        | GCN [9], GraphSage-Max [14], GatedGCN [11], GRN [16], R-GCN [15]
HyGCN [17]       | GCN, GraphSage-Mean [14], GIN [24], DiffPool [12]
Auten et al. [19] | GCN, GAT [13], PGNN [19]
AWB-GCN [18]     | GCN
GRIP [20]        | GCN, GraphSage-Max, GIN, GatedGCN
…”
Section: Accelerators / Algorithms Supported
confidence: 99%
“…Hardware acceleration of GNN inference, and graph processing in general, has been an active area of study (Besta et al., 2019; Gui et al., 2019; Ming Xiong, 2020). Nurvitadhi et al. (2014); Ozdal et al. (2016); Zeng and Prasanna (2020); Yan et al. (2020); Auten et al. (2020); Geng et al. (2020); Kiningham et al. (2020) describe various other examples of GNN acceleration architectures. While these frameworks are applicable to various graph processing tasks, they may not apply to the strict latency requirements of the LHC trigger, and they typically require the user to specify the design in a highly specialized format.…”
Section: Related Work
confidence: 99%