2022
DOI: 10.48550/arxiv.2202.04579
Preprint

Neural Sheaf Diffusion: A Topological Perspective on Heterophily and Oversmoothing in GNNs

Abstract: Cellular sheaves equip graphs with "geometrical" structure by assigning vector spaces and linear maps to nodes and edges. Graph Neural Networks (GNNs) implicitly assume a graph with a trivial underlying sheaf. This choice is reflected in the structure of the graph Laplacian operator, the properties of the associated diffusion equation, and the characteristics of the convolutional models that discretise this equation. In this paper, we use cellular sheaf theory to show that the underlying geometry of the graph …
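As a minimal sketch of the construction the abstract refers to: a cellular sheaf assigns a d-dimensional stalk to each node and a restriction map to each node–edge incidence, and the sheaf Laplacian is assembled block-wise from those maps. The function below is an illustrative implementation of that standard construction, not code from the paper; the names and the dictionary keying convention are assumptions. With identity restriction maps and d = 1 (the trivial sheaf the abstract says GNNs implicitly assume), it reduces to the ordinary graph Laplacian D − A.

```python
import numpy as np

def sheaf_laplacian(n, d, edges, restrictions):
    """Assemble the sheaf Laplacian for a graph with n nodes and
    d-dimensional stalks. restrictions[(u, e)] is the d x d map from
    the stalk of node u to the stalk of edge e."""
    L = np.zeros((n * d, n * d))
    for e, (u, v) in enumerate(edges):
        Fu = restrictions[(u, e)]
        Fv = restrictions[(v, e)]
        # Diagonal blocks accumulate F^T F; off-diagonal blocks are -F_u^T F_v.
        L[u*d:(u+1)*d, u*d:(u+1)*d] += Fu.T @ Fu
        L[v*d:(v+1)*d, v*d:(v+1)*d] += Fv.T @ Fv
        L[u*d:(u+1)*d, v*d:(v+1)*d] -= Fu.T @ Fv
        L[v*d:(v+1)*d, u*d:(u+1)*d] -= Fv.T @ Fu
    return L

# Trivial sheaf (all restriction maps identity, d = 1) on the path 0-1-2:
edges = [(0, 1), (1, 2)]
F = {}
for e, (u, v) in enumerate(edges):
    F[(u, e)] = np.eye(1)
    F[(v, e)] = np.eye(1)
L = sheaf_laplacian(3, 1, edges, F)
# For the trivial sheaf, L equals the ordinary graph Laplacian D - A.
```

Choosing non-trivial restriction maps changes the diffusion induced by L, which is the mechanism the paper connects to heterophily and oversmoothing.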

Cited by 4 publications (3 citation statements)
References 13 publications (22 reference statements)
“…The maximum accuracies demonstrated by our own implementation and those reported in other papers are shown in bold. The rows marked with "*" indicate the accuracies from papers [16], [37], [40], [41]. Figure 5 shows the optimal parameter settings used in MSI-H2GCN-2, which exhibited the highest average accuracy, as shown in Table 2.…”
Section: Results
confidence: 99%
“…The magnetic Laplacian can be considered a special case of the connection Laplacian, which can be used for vector diffusion maps [196]. Moreover, the connection Laplacian is used for formulating sheaf neural networks [197,198], a new generation of neural networks that achieve strong performance on several machine learning tasks. Finally, the non-backtracking matrix, which identifies non-backtracking cycles and efficiently detects network communities [199], has recently been proposed for embedding oriented edges (non-backtracking embedding dimensions) [200].…”
Section: Network Embeddings Using Magnetic and Connection Laplacians
confidence: 99%
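The magnetic Laplacian mentioned in the statement above can be sketched concretely: for a directed graph it symmetrises the edge weights and encodes edge direction as a complex phase controlled by a charge parameter q, yielding a Hermitian, positive semi-definite operator. This is an illustrative implementation of that standard definition, not code from the cited survey; setting q = 0 recovers the Laplacian of the symmetrised graph.

```python
import numpy as np

def magnetic_laplacian(A, q=0.25):
    """Magnetic Laplacian of a directed adjacency matrix A.
    Direction is stored in complex phases; the result is Hermitian PSD."""
    As = (A + A.T) / 2.0                 # symmetrised edge weights
    theta = 2.0 * np.pi * q * (A - A.T)  # direction-dependent phase
    H = As * np.exp(1j * theta)          # Hermitian "magnetic" adjacency
    D = np.diag(As.sum(axis=1))          # degree matrix of the symmetrised graph
    return D - H

A = np.array([[0.0, 1.0],
              [0.0, 0.0]])              # single directed edge 0 -> 1
L = magnetic_laplacian(A, q=0.25)
# L is Hermitian with real, non-negative eigenvalues.
```

Viewing each phase factor as a 1-dimensional unitary transport along an edge is what makes this a special case of the connection Laplacian, which attaches a full orthogonal/unitary map to every edge.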
“…Among the more recent developments, we also found the constructions of Hansen and Gebhart [2020] and Bodnar et al. [2022] remarkable in that they imported the concept of sheaves into the setting of graph neural networks. This seems to be a natural approach that has the potential to produce more flexible models.…”
Section: Introduction
confidence: 96%