2020 IEEE/ACM 10th Workshop on Irregular Applications: Architectures and Algorithms (IA3)
DOI: 10.1109/ia351965.2020.00011

DistDGL: Distributed Graph Neural Network Training for Billion-Scale Graphs

Cited by 138 publications (92 citation statements)
References 7 publications
“…For instance, GraphSAINT [Zeng et al, 2020] utilizes three samplers, i.e., a node sampler, an edge sampler, and a random walk sampler, to sample nodes (or edges) and construct a subgraph in each batch. Moreover, subgraph sampling can be parallelized at the processing-unit level with the aid of a training scheduler [Zeng et al, 2019], making training efficient and easily scalable. Such parallelization takes good advantage of the property that subgraphs can be sampled independently.…”
Section: Graph-level Improvements
confidence: 99%
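The quoted statement summarizes GraphSAINT's sampler-based subgraph construction. Below is a minimal PyTorch sketch of the node-sampler variant; the function name, the 2×E `edge_index` layout, and the uniform node draw are illustrative assumptions, not GraphSAINT's actual API (the paper weights node probabilities by degree).

```python
import torch

def node_sampler_subgraph(edge_index, num_nodes, budget):
    # Hypothetical sketch of a GraphSAINT-style node sampler:
    # draw a budget of nodes, then induce the subgraph on them.
    sampled = torch.randperm(num_nodes)[:budget]     # uniform for simplicity
    mask = torch.zeros(num_nodes, dtype=torch.bool)
    mask[sampled] = True
    # Keep only edges whose endpoints both fall inside the sample.
    src, dst = edge_index
    keep = mask[src] & mask[dst]
    sub = edge_index[:, keep]
    # Relabel node ids to the compact range 0..budget-1.
    relabel = torch.full((num_nodes,), -1, dtype=torch.long)
    relabel[sampled] = torch.arange(sampled.numel())
    return relabel[sub], sampled  # subgraph edges, original node ids
```

Because each call is independent of every other, many such subgraphs can be drawn concurrently by worker processes, which is the independence property the quote highlights.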
“…SGCN formulates graph sparsification as an optimization problem and resolves it via an alternating direction method of multipliers (ADMM) approach [Boyd et al, 2011]. Additionally, GAUG [Zhao et al, 2021] proposes two variants: GAUG-M utilizes an edge predictor to acquire probabilities of edges in a graph and modifies input graphs based on the predicted probabilities; GAUG-O integrates the edge predictor and GNN model to jointly promote edge prediction and model accuracy.…”
confidence: 99%
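As a rough illustration of the GAUG-M step the quote describes, the sketch below edits a dense boolean adjacency using a predictor's edge probabilities: it adds the highest-probability non-edges and drops the lowest-probability existing edges. The function name, the dense-matrix representation, and the explicit add/drop budgets are hypothetical simplifications, not the paper's interface.

```python
import torch

def gaug_m_modify(adj, edge_probs, num_add, num_drop):
    # Sketch of GAUG-M-style graph modification (hypothetical API).
    # adj: (N, N) bool adjacency; edge_probs: (N, N) predictor output.
    new_adj = adj.clone()
    # Add the num_add most probable edges among current non-edges.
    add_scores = edge_probs.masked_fill(adj, float("-inf"))
    top = torch.topk(add_scores.flatten(), num_add).indices
    new_adj.view(-1)[top] = True
    # Drop the num_drop least probable edges among original edges.
    drop_scores = edge_probs.masked_fill(~adj, float("inf"))
    bottom = torch.topk(drop_scores.flatten(), num_drop, largest=False).indices
    new_adj.view(-1)[bottom] = False
    return new_adj
```

The additions and removals draw from disjoint sets (non-edges vs. edges of the original graph), so the two steps cannot conflict.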
“…There are also many works that use supervised learning and graph neural network (GNN) libraries [13,22,30,31,37,45,46,50,52] to solve graph-related applications. Lately, a few researchers have started to employ supervised GNN learning to solve combinatorial optimization problems.…”
Section: Related Work
confidence: 99%
“…For instance, sampling-based approaches aim at reducing the neighborhood size via layer sampling [2], [3], [9], clustering-based sampling [5], and graph sampling [27] techniques; these prior works approach the problem purely from an algorithmic angle. A few recent works [23], [30] investigate distributed multi-GPU training of GNNs and achieve good parallel efficiency and memory scalability on large GPU clusters.…”
Section: Introduction
confidence: 99%
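The quoted statement contrasts algorithmic neighborhood reduction with distributed training, which is DistDGL's setting. Below is a minimal pure-Python sketch of the fan-out neighbor sampling that layer-sampling approaches (and DistDGL-style mini-batch training) build on; the `adj` dict-of-adjacency-lists representation and the function name are assumptions for illustration.

```python
import random

def sample_layers(adj, seeds, fanouts, rng=random):
    # Layer-wise neighbor sampling (sketch): starting from the
    # mini-batch seed nodes, sample at most `fanout` neighbors per
    # node for each GNN layer, working outward from the seeds.
    layers, frontier = [], set(seeds)
    for fanout in fanouts:
        sampled = {}
        for v in frontier:
            neigh = adj.get(v, [])
            sampled[v] = neigh if len(neigh) <= fanout else rng.sample(neigh, fanout)
        layers.append(sampled)
        # The next frontier is every node touched at this layer.
        frontier = set(sampled) | {u for vs in sampled.values() for u in vs}
    return layers
```

Capping the fan-out bounds the cost of each mini-batch regardless of graph size, which is what lets this style of training scale to billion-edge graphs once combined with distributed graph partitioning.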