2019
DOI: 10.1609/aaai.v33i01.33014602

Weisfeiler and Leman Go Neural: Higher-Order Graph Neural Networks

Abstract: In recent years, graph neural networks (GNNs) have emerged as a powerful neural architecture to learn vector representations of nodes and graphs in a supervised, end-to-end fashion. Up to now, GNNs have only been evaluated empirically, showing promising results. The following work investigates GNNs from a theoretical point of view and relates them to the 1-dimensional Weisfeiler-Leman graph isomorphism heuristic (1-WL). We show that GNNs have the same expressiveness as the 1-WL in terms of distinguishing non-is…
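As a quick orientation, the 1-WL heuristic named in the abstract can be sketched in a few lines. The following is a minimal, illustrative implementation of color refinement, not code from the paper; the adjacency-list representation, the fixed number of rounds, and the use of Python's built-in hash are assumptions made for compactness.

```python
# Minimal sketch of 1-WL color refinement (illustrative, not the paper's code).
# Graphs are given as adjacency lists: {node: [neighbors]}.

def wl_refine(adj, rounds=3):
    """Recolor nodes by hashing their own color together with the
    sorted multiset of their neighbors' colors, for a few rounds."""
    colors = {v: 0 for v in adj}  # uniform initial coloring
    for _ in range(rounds):
        colors = {
            v: hash((colors[v], tuple(sorted(colors[u] for u in adj[v]))))
            for v in adj
        }
    return sorted(colors.values())  # graph-level color histogram

# 1-WL reports two graphs as non-isomorphic if their histograms differ.
triangle = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
path = {0: [1], 1: [0, 2], 2: [1]}
print(wl_refine(triangle) != wl_refine(path))  # True: the pair is distinguished
```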

Cited by 974 publications (979 citation statements); references 24 publications. Selected citation statements, ordered by relevance:
“…Graph Neural Networks (GNNs) learn graph level embeddings by aggregating node representations learned via convolving neighborhood information throughout the neural network's layers. This idea has been the basis of many popular neural networks and is as powerful as WL-Kernels for classification [14,26]. We refer interested readers to a comprehensive survey of these models [25].…”
Section: Related Work (mentioning)
confidence: 99%
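The aggregate-then-read-out scheme this statement describes can be sketched as follows. The layer below is a generic sum-over-self-and-neighbors update with a ReLU, followed by a permutation-invariant sum readout; the weight shapes, activation, and depth are illustrative assumptions, not the specific architectures of [14,25,26].

```python
# Hedged sketch of neighborhood aggregation plus graph-level readout.
import numpy as np

def gnn_layer(A, H, W):
    """One message-passing layer: combine each node's features with
    its neighbors' (A + I adds the self-loop), then apply ReLU."""
    return np.maximum((A + np.eye(len(A))) @ H @ W, 0.0)

def graph_embedding(A, X, weights):
    H = X
    for W in weights:      # stacking layers widens the receptive field
        H = gnn_layer(A, H, W)
    return H.sum(axis=0)   # permutation-invariant sum readout

rng = np.random.default_rng(0)
A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)  # path on 3 nodes
X = rng.normal(size=(3, 4))                                   # node features
weights = [rng.normal(size=(4, 4)) for _ in range(2)]         # two layers
print(graph_embedding(A, X, weights).shape)                   # (4,)
```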
“…In a more practical approach, graphs are first mapped into fixed dimensional feature vectors, where vector space-based algorithms are then employed. In a supervised setting, these feature vectors are learned through neural networks [14,25,26]. In unsupervised settings, the feature vectors are descriptive statistics of the graph such as average degree, the eigenspectrum, or spectra of sub-graphs of order at most k contained in the graph [7,11,17,18,22,23].…”
Section: Introduction (mentioning)
confidence: 99%
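The unsupervised route mentioned here, describing a graph by a fixed-dimensional vector of statistics, might look like the sketch below; the specific choice of average degree plus the top-k adjacency eigenvalues, and the value of k, are assumptions made for illustration.

```python
# Illustrative unsupervised graph features: average degree + spectrum head.
import numpy as np

def graph_features(A, k=3):
    """Map an undirected graph (symmetric adjacency A) to a fixed vector."""
    avg_degree = A.sum(axis=1).mean()
    spectrum = np.sort(np.linalg.eigvalsh(A))[::-1]  # descending eigenvalues
    top_k = np.pad(spectrum[:k], (0, max(0, k - len(spectrum))))
    return np.concatenate(([avg_degree], top_k))

A = np.array([[0, 1, 1], [1, 0, 1], [1, 1, 0]], dtype=float)  # triangle
print(graph_features(A))  # [ 2.  2. -1. -1.]
```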
“…Proposed extensions include a pooling architecture that learns a soft clustering of the graph [59], or a two-tower model which frames graph similarity as link prediction between GCN representations [4]. Interestingly, it has been shown that many of these methods are not necessarily more expressive than the original Weisfeiler-Lehman subtree kernel itself [33].…”
Section: Supervised Graph Similarity (mentioning)
confidence: 99%
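One coarsening step of a soft-clustering pooling architecture in the spirit of [59] can be written compactly: an assignment matrix maps n nodes to c clusters and coarsens both features and adjacency. In the sketch below the assignment is random rather than learned, an assumption made only to keep the example self-contained.

```python
# Rough sketch of one soft-clustering pooling (coarsening) step.
import numpy as np

def soft_pool(A, X, S):
    """S is an (n, c) row-stochastic soft cluster-assignment matrix."""
    X_coarse = S.T @ X      # cluster features = assignment-weighted node features
    A_coarse = S.T @ A @ S  # cluster adjacency = aggregated edge mass
    return A_coarse, X_coarse

rng = np.random.default_rng(0)
n, c, d = 6, 2, 4
logits = rng.normal(size=(n, c))
S = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)  # row softmax
upper = np.triu((rng.random((n, n)) < 0.4).astype(float), 1)    # random graph
A = upper + upper.T
X = rng.normal(size=(n, d))
A2, X2 = soft_pool(A, X, S)
print(A2.shape, X2.shape)  # (2, 2) (2, 4)
```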
“…the problem of assigning a label to the entire graph). These supervised approaches [33,36,45,61] learn an intermediate representation of an entire graph as a precondition in order to solve the classification task. This learned representation can be used to compare similarity between graphs, but is heavily biased towards maximizing performance on the classification task of interest.…”
Section: Introduction (mentioning)
confidence: 99%
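Once a graph-level representation has been learned, comparing graphs reduces to comparing vectors; a common choice is cosine similarity. The stand-in embedding vectors below are random and purely illustrative, not outputs of the cited models.

```python
# Sketch: comparing two graphs via their learned embedding vectors.
import numpy as np

def cosine_similarity(u, v):
    """Cosine similarity between two graph-level embeddings."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

rng = np.random.default_rng(1)
z1, z2 = rng.normal(size=4), rng.normal(size=4)  # stand-ins for two graphs
print(round(cosine_similarity(z1, z2), 3))
```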
“…Other works [39,40] emphasise that GNNs can, at best, be as discriminative as the 1-WL (in terms of the ability to distinguish non-isomorphic graphs with the extracted features). However, the flexibility of GNNs is expected to enhance their representation power with respect to the problem at hand and therefore to enable them to overtake state-of-the-art performance.…”
Section: Molecular Graph Encoder (mentioning)
confidence: 99%
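A toy illustration of the discriminative-power point made above: with a mean aggregator, two different neighbor multisets can collapse to the same value, while an injective-style sum keeps them apart. The multisets are invented for this example and do not come from [39,40].

```python
# Mean vs. sum aggregation over two different neighbor-feature multisets.
neighbors_a = [1.0, 2.0]            # one copy of each feature value
neighbors_b = [1.0, 1.0, 2.0, 2.0]  # two copies of each feature value

mean = lambda xs: sum(xs) / len(xs)
print(mean(neighbors_a) == mean(neighbors_b))  # True: mean cannot tell them apart
print(sum(neighbors_a) == sum(neighbors_b))    # False: sum distinguishes them
```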