2022
DOI: 10.1109/tkde.2022.3172903

Graph Self-Supervised Learning: A Survey

Abstract: Deep learning on graphs has attracted significant interest recently. However, most existing work has focused on (semi-)supervised learning, resulting in shortcomings including heavy label reliance, poor generalization, and weak robustness. To address these issues, self-supervised learning (SSL), which extracts informative knowledge through well-designed pretext tasks without relying on manual labels, has become a promising and trending learning paradigm for graph data. Different from SSL on other domains lik…

Cited by 219 publications (81 citation statements)
References 127 publications
“…Pre-training of GNNs. Various recent works have formulated methods for pre-training using graph data [39,27,75,31], rather than 3D point clouds of atom nuclei as in this paper. Approaches based on contrastive methods rely on learning representations by contrasting different views of the input graph [64,69,79,37], or bootstrapping [65].…”
Section: Related Work
confidence: 99%
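The contrastive paradigm mentioned in the excerpt above — learning node representations by contrasting augmented views of the same graph — can be sketched in a few lines. This is a minimal NumPy illustration under assumed design choices, not the implementation of any cited method: the edge-dropping augmentation, the one-layer mean-aggregation encoder, the helper names (`drop_edges`, `encode`, `nt_xent`), and the temperature value are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def drop_edges(adj, p, rng):
    """Create a graph view by randomly dropping a fraction p of edges (kept symmetric)."""
    keep = rng.random(adj.shape) > p
    keep = np.triu(keep, 1)
    keep = keep | keep.T
    return adj * keep

def encode(adj, feats, weight):
    """Toy one-layer encoder: mean-aggregate neighbor features, then project."""
    deg = adj.sum(1, keepdims=True) + 1e-8
    return ((adj @ feats) / deg) @ weight

def nt_xent(z1, z2, tau=0.5):
    """InfoNCE-style loss: the same node's embeddings in both views are positives."""
    z1 = z1 / (np.linalg.norm(z1, axis=1, keepdims=True) + 1e-8)
    z2 = z2 / (np.linalg.norm(z2, axis=1, keepdims=True) + 1e-8)
    sim = (z1 @ z2.T) / tau                              # pairwise cosine similarities
    log_prob = sim - np.log(np.exp(sim).sum(1, keepdims=True))
    return -np.mean(np.diag(log_prob))                   # positives sit on the diagonal

# toy graph: 4 nodes in a ring, random features and projection
adj = np.array([[0, 1, 0, 1],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [1, 0, 1, 0]], dtype=float)
feats = rng.normal(size=(4, 8))
weight = rng.normal(size=(8, 4))

view1 = drop_edges(adj, 0.2, rng)
view2 = drop_edges(adj, 0.2, rng)
loss = nt_xent(encode(view1, feats, weight), encode(view2, feats, weight))
print(round(loss, 4))
```

In a real contrastive GNN framework the encoder would be a trained multi-layer GNN and the loss would be minimized by gradient descent; here the pieces are only wired together once to show the view-augment/encode/contrast structure.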
“…Graph Self-Supervised Learning: In graphs, recent works have explored several paradigms for self-supervised learning: see [62] for an up-to-date survey. Graph pretext tasks are often reminiscent of image in-painting tasks [63], and seek to complete masked graphs and/or node features ([64,13]).…”
Section: G Related Work
confidence: 99%
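The in-painting-style pretext task described in the excerpt above — masking node features and asking the model to complete them — can be sketched as follows. This is a hedged toy sketch, not any cited paper's method: the path-graph topology, the choice of masked nodes, and the neighbor-mean "decoder" are all assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# toy graph: 5 nodes on a path, 3-dimensional node features
adj = np.zeros((5, 5))
for i in range(4):
    adj[i, i + 1] = adj[i + 1, i] = 1.0
feats = rng.normal(size=(5, 3))

# mask a subset of nodes: zero out their features, remember the originals as targets
masked = np.array([1, 3])
targets = feats[masked].copy()
corrupted = feats.copy()
corrupted[masked] = 0.0

# toy "decoder": predict each node's features as the mean of its neighbors' features
deg = adj.sum(1, keepdims=True)
recon = (adj @ corrupted) / np.maximum(deg, 1.0)

# reconstruction (in-painting) loss, computed on the masked nodes only
loss = np.mean((recon[masked] - targets) ** 2)
print(round(loss, 4))
```

In an actual masked-feature pretext task, a GNN encoder/decoder would be trained to minimize this reconstruction loss over many masking draws; the sketch only shows the mask/reconstruct/score loop for one draw.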
“…Graph Neural Networks. Over the last decade, hundreds of GNN methods, libraries, and software have been developed for semi-supervised, unsupervised, and self-supervised learning on graphs [1,[13][14][15][16][17][18]. Their success in graph learning tasks and their limitations such as over-smoothing and neighborhood explosion problems are also well-documented in the literature [19][20][21].…”
Section: Related Work
confidence: 99%