2023
DOI: 10.48550/arxiv.2302.07090
Preprint

A Complete Expressiveness Hierarchy for Subgraph GNNs via Subgraph Weisfeiler-Lehman Tests

Abstract: Recently, subgraph GNNs have emerged as an important direction for developing expressive graph neural networks (GNNs). While numerous architectures have been proposed, so far there is still a limited understanding of how various design paradigms differ in terms of expressive power, nor is it clear what design principle achieves maximal expressiveness with minimal architectural complexity. Targeting these fundamental questions, this paper conducts a systematic study of general node-based subgraph GNNs through t…
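For background on the Weisfeiler-Lehman tests named in the title: classic 1-WL is an iterative color-refinement procedure that upper-bounds the expressive power of message-passing GNNs. Below is a minimal sketch of plain 1-WL, not the paper's subgraph variant; the function name and toy graphs are illustrative.

```python
from collections import Counter

def one_wl(adj, num_iters=3):
    """Classic 1-WL color refinement on an undirected graph.

    adj: dict mapping each node to a list of its neighbors.
    Returns the multiset of final colors; two graphs whose
    multisets differ are certainly non-isomorphic.
    """
    colors = {v: 0 for v in adj}  # uniform initial coloring
    for _ in range(num_iters):
        # New color = (own color, sorted multiset of neighbor colors)
        signatures = {
            v: (colors[v], tuple(sorted(colors[u] for u in adj[v])))
            for v in adj
        }
        # Compress signatures into small integer color ids
        palette = {sig: i for i, sig in enumerate(sorted(set(signatures.values())))}
        colors = {v: palette[signatures[v]] for v in adj}
    return Counter(colors.values())

# A triangle and a 3-node path get different color histograms under 1-WL.
triangle = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
path3 = {0: [1], 1: [0, 2], 2: [1]}
assert one_wl(triangle) != one_wl(path3)
```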

Cited by 4 publications (17 citation statements). References 18 publications.
“…First, it connects spectral invariant GNNs and GTs with the seemingly unrelated research direction of Subgraph GNNs (Cotta et al., 2021; Bevilacqua et al., 2022; Frasca et al., 2022; Qian et al., 2022; Zhao et al., 2022), a line of research studying expressive GNNs from a structural and permutation symmetry perspective. Second, combined with recent results (Frasca et al., 2022; Zhang et al., 2023a), it implies that EPNN is strictly bounded by 3-WL. As an implication, bounding previously proposed spectral invariant methods by EPNN would readily indicate that they are all strictly less expressive than 3-WL.…”
Section: Introduction (supporting)
confidence: 60%
“…Our first theoretical result establishes a tight expressiveness upper bound for EPNN, showing that it is strictly less expressive than an important class of Subgraph GNNs proposed in Zhang et al. (2023a), called PSWL. This observation is intriguing for two reasons.…”
Section: Introduction (mentioning)
confidence: 84%
“…One also has GNNs based on spectral graph information (e.g., Bruna et al., 2014; Defferrard et al., 2016; Gama et al., 2019; Kipf and Welling, 2017; Levie et al., 2019; Monti et al., 2017; Balcilar et al., 2021b). Some GNN architectures can employ vertex identifiers (Murphy et al., 2019; Vignac et al., 2020), use random features (Abboud et al., 2021; Dasoulas et al., 2020; Sato et al., 2021), equivariant graph polynomials (Puny et al., 2023), homomorphism and subgraph counts (Barceló et al., 2021; Bouritsas et al., 2020; Nguyen and Maehara, 2020), simplicial (Bodnar et al., 2021b) and cellular complexes (Bodnar et al., 2021a), persistent homology (Horn et al., 2022), random walks (Tönshoff et al., 2021; Martinkus et al., 2022), graph decompositions (Talak et al., 2021), relational, distance (Li et al., 2020) and directional information (Beaini et al., 2021), subgraph information (Cotta et al., 2021; Feng et al., 2022; Huang et al., 2023; Papp et al., 2021; Qian et al., 2022; Thiede et al., 2021; Wijesinghe and Wang, 2022; You et al., 2021; Zhang and Li, 2021; Zhao et al., 2022; Zhang et al., 2023a), and biconnectivity (Zhang et al., 2023b). Examples of graph neural network architectures using higher-order p-vertex embeddings for p ≥ 2 are, e.g., (Azizian and Lelarge, 2021; …”
Section: Embedding Methods (mentioning)
confidence: 99%
“…NGNN encodes nodes' local subgraph information as their initial embeddings, rather than subtree information, which makes it strictly more powerful than MPNNs such as GIN (Xu et al., 2018). However, Zhang et al. (2023) pointed out that NGNN is weaker at learning topological graph structures than the subgraph GNNs of Qian et al. (2022), Zhao et al. (2021), Bevilacqua et al. (2021), and Frasca et al. (2022), since it contains no cross-subgraph operations that pass messages across different subgraphs.…”
Section: Nested Graph Neural Network (mentioning)
confidence: 97%
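The cross-subgraph distinction in this excerpt can be made concrete: a node-based subgraph GNN turns one graph into a bag of per-node subgraphs (e.g., k-hop ego networks), runs message passing inside each, and pools. The following is a minimal pure-Python sketch of that NGNN-style pipeline, assuming sum aggregation and all-ones initial features (illustrative choices, not the cited architectures); stronger variants such as PSWL additionally pass messages across subgraphs, which this sketch deliberately omits.

```python
from collections import deque

def ego_network(adj, root, k):
    """Nodes within k hops of `root` (breadth-first search)."""
    seen, frontier = {root}, deque([(root, 0)])
    while frontier:
        v, d = frontier.popleft()
        if d == k:
            continue
        for u in adj[v]:
            if u not in seen:
                seen.add(u)
                frontier.append((u, d + 1))
    return seen

def subgraph_embedding(adj, nodes, num_layers=2):
    """Toy message passing restricted to `nodes`: each feature becomes
    1 plus the sum of neighbor features, starting from all-ones,
    then the node features are sum-pooled."""
    h = {v: 1 for v in nodes}
    for _ in range(num_layers):
        h = {v: 1 + sum(h[u] for u in adj[v] if u in nodes) for v in nodes}
    return sum(h.values())

def ngnn_style_embedding(adj, k=1):
    """One subgraph per node; the graph embedding is the multiset
    (here: a sorted tuple) of per-subgraph embeddings. No messages
    are exchanged across subgraphs."""
    return tuple(sorted(subgraph_embedding(adj, ego_network(adj, v, k))
                        for v in adj))

# 1-hop ego networks already separate a triangle from a 3-node path.
triangle = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
path3 = {0: [1], 1: [0, 2], 2: [1]}
assert ngnn_style_embedding(triangle) != ngnn_style_embedding(path3)
```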
“…node reordering. However, it has been shown that such functions are not universal (Xu et al., 2018; Morris et al., 2019), leading to the development of more expressive GNN frameworks (Maron et al., 2018; Morris et al., 2019; Zhang et al., 2023). Similar challenges arise in the geometric setting, where models must additionally respect the Euclidean symmetry, and achieving universality becomes a non-trivial task as well.…”
Section: Introduction (mentioning)
confidence: 99%
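For intuition on the invariance constraint in the excerpt above: a graph-level function must return the same value under any relabeling of the nodes. A small self-contained check that a degree-weighted sum readout (an illustrative choice) has this property:

```python
import itertools

def sum_readout(adj, feats):
    """Permutation-invariant readout: sum each node's feature weighted
    by its degree. Relabeling the nodes cannot change the result."""
    return sum(len(adj[v]) * feats[v] for v in adj)

def relabel(adj, feats, perm):
    """Apply a node relabeling v -> perm[v] to a graph and its features."""
    new_adj = {perm[v]: [perm[u] for u in nbrs] for v, nbrs in adj.items()}
    new_feats = {perm[v]: x for v, x in feats.items()}
    return new_adj, new_feats

graph = {0: [1], 1: [0, 2], 2: [1]}
feats = {0: 1.0, 1: 2.0, 2: 3.0}

# Every permutation of the node labels yields the same readout value.
for p in itertools.permutations(range(3)):
    perm = dict(enumerate(p))
    g2, f2 = relabel(graph, feats, perm)
    assert sum_readout(g2, f2) == sum_readout(graph, feats)
```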