2020
DOI: 10.48550/arxiv.2012.12533
Preprint

Motif-Driven Contrastive Learning of Graph Representations

Abstract: Graph motifs are significant subgraph patterns that occur frequently in graphs, and they play important roles in representing whole-graph characteristics. For example, in the chemical domain, functional groups are motifs that can determine molecule properties. Mining and utilizing motifs, however, is a non-trivial task for large graph datasets. Traditional motif discovery approaches rely on exact counting or statistical estimation, which are hard to scale to large datasets with continuous and high-dimension fe…
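The abstract contrasts exact motif counting with learned representations. As a minimal illustration of why exact counting scales poorly, the sketch below enumerates triangle motifs by brute force over all node triples (`count_triangles` is an illustrative helper, not part of the paper's method):

```python
import networkx as nx
from itertools import combinations

def count_triangles(G):
    """Exact triangle-motif count by enumerating all node triples.

    O(n^3) comparisons -- this brute-force enumeration is exactly the
    kind of exact counting that does not scale to large graphs.
    """
    count = 0
    for u, v, w in combinations(G.nodes, 3):
        if G.has_edge(u, v) and G.has_edge(v, w) and G.has_edge(u, w):
            count += 1
    return count

# Two triangles sharing node 2: (0, 1, 2) and (2, 3, 4).
G = nx.Graph([(0, 1), (1, 2), (0, 2), (2, 3), (3, 4), (2, 4)])
print(count_triangles(G))  # -> 2
```

Learned approaches like the one in this paper sidestep this enumeration by sampling and clustering subgraphs instead of counting every pattern exactly.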


Cited by 16 publications (33 citation statements)
References 18 publications
“…However, GCA is focused on network data and is not suitable for molecular graphs. Instead of focusing on augmentation views, MICRO-Graph [46] proposed to contrast based on subgraphs (motifs). GCC [24] proposed to use random walks to generate subgraphs and contrast between them.…”
Section: Related Work
confidence: 99%
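The random-walk subgraph generation attributed to GCC above can be sketched roughly as follows; the function name and signature are illustrative assumptions, not GCC's actual implementation (GCC itself uses random walks with restart):

```python
import random
import networkx as nx

def random_walk_subgraph(G, start, walk_length, seed=None):
    """Collect the nodes visited by a simple random walk and return the
    induced subgraph. Illustrative sketch, not GCC's actual API."""
    rng = random.Random(seed)
    node = start
    visited = {node}
    for _ in range(walk_length):
        neighbors = list(G.neighbors(node))
        if not neighbors:  # dead end: isolated node
            break
        node = rng.choice(neighbors)
        visited.add(node)
    return G.subgraph(visited)

# Two walks from the same start node give two correlated "views" of its
# neighborhood, which a contrastive loss can then treat as a positive pair.
G = nx.karate_club_graph()
view_a = random_walk_subgraph(G, start=0, walk_length=8, seed=1)
view_b = random_walk_subgraph(G, start=0, walk_length=8, seed=2)
```

Contrasting such sampled views differs from MICRO-Graph's approach, which contrasts motif-like subgraphs against whole graphs rather than walk-sampled fragments against each other.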
“…Furthermore, [39,28,41] proposed to use mutual information maximization as the optimization objective for GNNs. Recently, more self-supervised tasks for GNNs have been proposed [14,15,52,51,32,10,54,30]. Based on how the self-supervised tasks are constructed, these models can be classified into two categories, namely contrastive models and predictive models.…”
Section: Self-supervised Learning of Graphs
confidence: 99%
“…Based on how the self-supervised tasks are constructed, these models can be classified into two categories, namely contrastive models and predictive models. Contrastive models try to generate informative and diverse views from data instances and perform node-to-context [14], node-to-graph [39], or motif-to-graph [54] contrastive learning. On the other hand, predictive models are trained in a supervised fashion, where the labels are generated based on certain properties of the input graph data, e.g., node attributes [32], or by selecting certain parts of the graph [15,32].…”
Section: Self-supervised Learning of Graphs
confidence: 99%
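The contrastive family described in this excerpt typically optimizes an InfoNCE-style objective, where matched views (e.g., a motif and its source graph) are pulled together and mismatched batch pairs pushed apart. A minimal NumPy sketch, assuming row i of `positives` is the matched view of row i of `anchors` (not any specific paper's loss):

```python
import numpy as np

def info_nce(anchors, positives, temperature=0.1):
    """Minimal InfoNCE: row i of `positives` is the positive for row i of
    `anchors`; every other row in the batch serves as a negative."""
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    logits = a @ p.T / temperature               # (N, N) cosine similarities
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))          # -log softmax of positives

# Perfectly matched views give a much lower loss than mismatched ones.
rng = np.random.default_rng(0)
emb = rng.normal(size=(8, 16))
matched_loss = info_nce(emb, emb)
shuffled_loss = info_nce(emb, emb[::-1])
```

Predictive models, by contrast, need no negative pairs: they regress or classify generated labels (e.g., masked node attributes) directly.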
“…The current lack of a unified evaluation criterion for graph embedding-based recommendation and conventional recommendation will lead to long-standing discussions on these controversies in the future, involving expanded perspectives on accuracy, scalability, extensibility, and explainability, with participation from interdisciplinary researchers ranging from mathematicians to data scientists. Developing graph embedding-based recommendation and conventional recommendation is not contradictory, for the methods of analyzing graph topological characteristics behind conventional recommendation can inspire graph embedding-based recommendation in the utilization of structures such as subgraphs [64], motifs [65][66][67], and neighborhoods [68][69][70] to promote embedding explainability [39] and recommendation performance. Meanwhile, graph embedding-based recommendation has pioneered novel recommendation scenarios, including conversational recommender systems (CRS) [71] and news recommendation [72], providing more promising application prospects for conventional recommendation.…”
Section: Introduction
confidence: 99%