2021
DOI: 10.48550/arxiv.2105.10862
Preprint

Hypergraph Pre-training with Graph Neural Networks

Abstract: Despite the prevalence of hypergraphs in a variety of high-impact applications, there are relatively few works on hypergraph representation learning, most of which primarily focus on hyperlink prediction and are often restricted to the transductive learning setting. Among others, a major hurdle for effective hypergraph representation learning lies in the label scarcity of nodes and/or hyperedges. To address this issue, this paper presents an end-to-end, bi-level pre-training strategy with Graph Neural Networks fo…

Cited by 2 publications (5 citation statements) | References 31 publications
“…While self-supervised training has been extensively studied for graphs with pairwise links [16][17][18][19][20][21][22][23], few works have explored it for hypergraph representation learning [27]. Our focus is on leveraging the attention mechanism [28] to predict hyperedges with arbitrary lengths and capture the indecomposability of a hyperedge, allowing PhyGCN to extract useful information from complex hypergraph structures.…”
Section: Discussion
confidence: 99%
“…While SSL methods have been proposed for GNNs [16][17][18][19][20][21][22][23], their application to hypergraph learning is underexplored. Noteworthy works [24][25][26] have targeted specific applications such as recommender systems, while others [27] propose pre-training GNNs on hypergraphs, primarily focusing on hyperedge prediction. A detailed discussion of related works can be found in Supplementary Note A.…”
Section: Introduction
confidence: 99%
“…Self-supervised hypergraph learning. Recently, there have been a handful of works studying self-supervised learning on hypergraphs [14]-[16], [40], [42]. HyperGene [42] adopts bi-level (node- and hyperedge-level) self-supervised tasks to effectively learn group-wise relations.…”
Section: Introduction
confidence: 99%
“…Recently, there have been a handful of works studying self-supervised learning on hypergraphs [14]-[16], [40], [42]. HyperGene [42] adopts bi-level (node- and hyperedge-level) self-supervised tasks to effectively learn group-wise relations. However, it adopts a clique expansion to transform a hypergraph into a simple graph, which incurs a significant loss of high-order information and does not employ contrastive learning.…”
Section: Introduction
confidence: 99%
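The clique expansion mentioned in the snippet above is a standard transformation: every hyperedge is replaced by a clique (all pairwise edges) over its nodes. A minimal illustrative sketch, assuming a hypergraph given simply as a list of node sets (the function name and representation are my own, not from the cited works):

```python
# Clique expansion: replace each hyperedge with all pairwise edges among
# its nodes, producing a simple graph. Note the information loss flagged
# above: after expansion it is no longer recoverable which node pairs
# co-occurred in the same hyperedge, or how many hyperedges shared a pair.
from itertools import combinations

def clique_expansion(hyperedges):
    """Return the edge set of the clique-expanded simple graph."""
    edges = set()
    for he in hyperedges:
        # Sort so each undirected edge has one canonical (u, v) form.
        for u, v in combinations(sorted(he), 2):
            edges.add((u, v))
    return edges

# Two overlapping hyperedges {0,1,2} and {1,2,3} expand to five simple
# edges; the shared pair (1, 2) collapses to a single edge.
print(sorted(clique_expansion([{0, 1, 2}, {1, 2, 3}])))
# → [(0, 1), (0, 2), (1, 2), (1, 3), (2, 3)]
```

This illustrates why methods that operate directly on the hypergraph (rather than on its expansion) can retain higher-order structure.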