2022
DOI: 10.1109/jas.2021.1004311
Sampling Methods for Efficient Training of Graph Convolutional Networks: A Survey

Abstract: Graph Convolutional Networks (GCNs) have received significant attention from various research fields due to their excellent performance in learning graph representations. Although GCNs perform well compared with other methods, they still face challenges. Training a GCN model for large-scale graphs in the conventional way requires high computation and memory costs. Therefore, motivated by an urgent need for efficiency and scalability in training GCNs, sampling methods have been proposed and achieve a significant ef…
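To make the abstract's efficiency point concrete, the sketch below illustrates GraphSAGE-style neighbor sampling, one of the sampling families such surveys cover: restricting each layer to a fixed fan-out bounds the per-batch cost regardless of graph size. This is a minimal illustration, not code from the paper; the adjacency dict, function name, and `fanout` parameter are assumptions.

```python
import random

def sample_neighbors(adj, nodes, fanout):
    """Return up to `fanout` randomly chosen neighbors per target node."""
    sampled = {}
    for v in nodes:
        neigh = adj.get(v, [])
        sampled[v] = random.sample(neigh, min(fanout, len(neigh)))
    return sampled

# Toy graph as an adjacency dict. Stacked per layer, a mini-batch touches
# O(batch_size * fanout^L) nodes instead of the full graph.
adj = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
print(sample_neighbors(adj, nodes=[0, 2], fanout=1))
```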

Cited by 75 publications (9 citation statements) · References 67 publications
“…To integrate information of transcriptomic profiles and spatial coordinates, PAST identifies k nearest neighbors (k-NN) for each spot using spatial coordinates in a Euclidean space, and adopts GCNs to aggregate spatial patterns from each spot’s neighbors. Specifically, the self-attention mechanism [21], which fits the relationship between words well in machine translation tasks [23], is used to model local spatial patterns between neighboring spots, while the ripple walk sampler [22], which enables efficient subgraph-based training for large and deep GCNs [24], is used to achieve better scalability on large-scale ST data and to preserve global spatial patterns simultaneously. PAST also restricts the distance of latent embeddings between neighbors through metric learning [25], the insight being that spatially close spots are more likely to be positive pairs showing similar latent patterns (Methods).…”
Section: Results
confidence: 99%
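Since the statement credits the ripple walk sampler for scalability, here is a minimal sketch of the ripple-style expansion it refers to, assuming the behavior described by the Ripple Walk Training idea: grow a connected subgraph from a random seed by repeatedly admitting a ratio r of the current frontier until a target size is reached. All names and the toy graph are illustrative, not the authors' implementation.

```python
import random

def ripple_walk_subgraph(adj, target_size, r=0.5, seed=None):
    """Sketch of a ripple-walk sampler: grow a connected subgraph by
    repeatedly admitting a ratio r of the current frontier's nodes."""
    rng = random.Random(seed)
    sub = {rng.choice(list(adj))}           # random seed node
    while len(sub) < target_size:
        # Neighbor set of the current subgraph (the "ripple" frontier).
        frontier = {u for v in sub for u in adj[v]} - sub
        if not frontier:                    # component exhausted
            break
        k = max(1, int(r * len(frontier)))
        k = min(k, target_size - len(sub))
        sub.update(rng.sample(sorted(frontier), k))
    return sub

# Toy usage on an 8-node ring graph.
adj = {i: [(i - 1) % 8, (i + 1) % 8] for i in range(8)}
print(ripple_walk_subgraph(adj, target_size=4, r=0.5, seed=0))
```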
“…Due to its unique characteristics, GCN is able to capture features more comprehensively. Therefore, scholars have turned their attention to GCNs and adopted effective training methods to improve their efficiency (Liu et al. 2021). Bosselut et al. (2019) proposed the COMET model, which generates diverse general knowledge in natural language.…”
Section: Event Relationship Extraction
confidence: 99%
“…For a homogeneous graph, GNNs are widely used to learn node representations from the graph structure. The pioneering GCN (Kipf and Welling 2017) proposes a multi-layer network following a layer-wise propagation rule, where the l-th layer learns an embedding vector h_v^(l) for each node v. GraphSAGE (Hamilton, Ying, and Leskovec 2017) improves the scalability for large graphs by introducing mini-batch training and neighbor sampling (Liu et al. 2022a). GAT (Veličković et al. 2018) introduces the attention mechanism to encourage the model to focus on the most important parts of the neighborhood.…”
Section: Related Work
confidence: 99%
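For reference, the layer-wise propagation rule cited above is H^(l+1) = σ(D̂^{-1/2} (A + I) D̂^{-1/2} H^(l) W^(l)) (Kipf and Welling 2017). A minimal numpy sketch of one such layer, purely illustrative rather than the cited implementation:

```python
import numpy as np

def gcn_layer(A, H, W):
    """One GCN layer: ReLU(D^-1/2 (A+I) D^-1/2 H W), per Kipf & Welling 2017."""
    A_hat = A + np.eye(A.shape[0])              # add self-loops
    d = A_hat.sum(axis=1)                       # degree vector of A_hat
    D_inv_sqrt = np.diag(d ** -0.5)             # D̂^{-1/2}
    A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt    # symmetric normalization
    return np.maximum(A_norm @ H @ W, 0.0)      # ReLU activation

# Toy example: 4 nodes on a path, 3-dim input features, 2-dim output.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
H = np.random.randn(4, 3)
W = np.random.randn(3, 2)
print(gcn_layer(A, H, W).shape)   # (4, 2)
```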