2020
DOI: 10.48550/arxiv.2006.00144
Preprint

Understanding the Message Passing in Graph Neural Networks via Power Iteration Clustering

Xue Li,
Yuanzhi Cheng

Abstract: The mechanism of message passing in graph neural networks (GNNs) remains poorly understood in the literature. To our knowledge, no one has proposed a theoretical origin for GNNs other than convolutional neural networks. Somewhat to our surprise, message passing is best understood in terms of power iteration. By removing the activation functions and layer weights of GNNs, we propose power iteration clustering (SPIC) models that are naturally interpretable and scalable. The experiment shows our m…
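The abstract's central claim can be sketched numerically: a GNN layer with its weights and nonlinearity removed reduces to repeated multiplication by a normalized adjacency matrix, which is exactly power iteration. The following is a minimal illustration, not code from the paper; the toy graph and all variable names are hypothetical.

```python
import numpy as np

# Toy undirected graph (hypothetical example, not taken from the paper).
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
A_hat = A + np.eye(4)                       # add self-loops
D_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
P = D_inv_sqrt @ A_hat @ D_inv_sqrt         # symmetrically normalized adjacency

# A GNN layer stripped of weights and activation is just X <- P X.
# Iterating that update is power iteration: X converges (up to sign and
# scale) to the dominant eigenvector of P.
X = np.random.default_rng(0).normal(size=(4, 1))
for _ in range(50):
    X = P @ X
    X /= np.linalg.norm(X)                  # renormalize to avoid over/underflow

# Compare against the top eigenvector computed directly.
w, V = np.linalg.eigh(P)
top = V[:, np.argmax(w)]
print(np.allclose(np.abs(X[:, 0]), np.abs(top), atol=1e-6))
```

Because all node features collapse toward the same dominant eigenspace, the interesting behavior of such stripped-down models lies in the clustering structure revealed along the way, which is the connection to power iteration clustering drawn in the abstract.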


Cited by 3 publications (3 citation statements)
References 7 publications
“…Considering that the input of the MoRGH model is a heterogeneous graph, the process of message passing [45], which plays an important role in graph neural network learning, is different from homogeneous graphs and requires the notion of types. Types are crucial as different types have their own set of different data tensors in heterogeneous graphs.…”
Section: Graph Auto-encoder Creation
confidence: 99%
“…MPNNs [47,14] have been used to predict the properties of molecules and materials [51,24,42,14,12,8,23,46,36,5,41,50,49,34], as well as to generate molecules and materials with desired properties [30,27]. There have been limited efforts to interpret or explain MPNNs, or graph neural networks (GNNs) in general [46,28,17,22,25,52].…”
Section: Introduction
confidence: 99%
“…To handle networks with node attributes, the field of graph neural networks (GNNs) has expanded rapidly in recent years [47][48][49][50]. Existing GNN models belong to the family of message passing frameworks [51,52], which update information for each node by recursively aggregating messages from its immediate neighbors in the graph. GNNs are generally classified into two categories, as follows.…”
confidence: 99%
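The last excerpt describes the message-passing framework itself: each node is updated by aggregating messages from its immediate neighbors. A minimal sketch of one such update, assuming mean aggregation over neighbors plus self (the function name and toy graph are illustrative, not from any cited model):

```python
import numpy as np

def message_passing_step(features, neighbors):
    """Update each node by averaging its own and its neighbors' features."""
    new = np.empty_like(features)
    for v, nbrs in neighbors.items():
        msgs = features[list(nbrs) + [v]]   # messages from neighbors + self
        new[v] = msgs.mean(axis=0)          # permutation-invariant aggregation
    return new

# Adjacency list for a small undirected graph (hypothetical).
neighbors = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
X = np.array([[1.0], [0.0], [0.0], [0.0]])  # one scalar feature per node
X1 = message_passing_step(X, neighbors)     # one round of message passing
```

Stacking such rounds recursively spreads each node's information across the graph, which is the "recursive aggregation" the excerpt refers to; real GNN layers additionally apply a learned weight matrix and a nonlinearity after the aggregation.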