2021
DOI: 10.48550/arxiv.2107.01410
Preprint

Maximum Entropy Weighted Independent Set Pooling for Graph Neural Networks

Abstract: In this paper, we propose a novel pooling layer for graph neural networks based on maximizing the mutual information between the pooled graph and the input graph. Since the maximum mutual information is difficult to compute, we employ the Shannon capacity of a graph as an inductive bias to our pooling method. More precisely, we show that the input graph to the pooling layer can be viewed as a representation of a noisy communication channel. For such a channel, sending the symbols belonging to an independent set…
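To make the abstract's independent-set view of pooling concrete, the sketch below greedily keeps a high-weight set of mutually non-adjacent nodes on a toy graph. The function name, the greedy heuristic, and the toy weights are illustrative assumptions; the actual MEWISPool layer selects its node set through a learned maximum-entropy objective rather than this heuristic.

```python
def greedy_weighted_independent_set(adj, weights):
    """adj: dict node -> set of neighbours; weights: dict node -> float score."""
    selected, blocked = set(), set()
    # Visit nodes in decreasing weight and keep a node only if none of its
    # neighbours has already been kept (the independence constraint).
    for node in sorted(weights, key=weights.get, reverse=True):
        if node not in blocked:
            selected.add(node)
            blocked |= adj[node] | {node}
    return selected

# Toy example: a 4-cycle 0-1-2-3-0 with unequal node scores.
adj = {0: {1, 3}, 1: {0, 2}, 2: {1, 3}, 3: {0, 2}}
weights = {0: 0.9, 1: 0.2, 2: 0.8, 3: 0.1}
print(greedy_weighted_independent_set(adj, weights))  # -> {0, 2}
```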

Cited by 3 publications (4 citation statements)
References 35 publications
“…In this section, we apply MEWISPool, a structure-adaptive pooling method that selects the crucial sub-structures of brain imaging for each individual. It is incorporated into GNNs in an end-to-end manner and enables effective hierarchical representations of graphs 19 . Compared to other graph pooling methods [20][21] , it is designed to preserve the information and connectivity of the graph and to reduce redundant information.…”
Section: Maximum Entropy Weighted Independent Set Pooling (MEWISPool)
confidence: 99%
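As a rough illustration of the end-to-end, hierarchical use described in the quotation above, the sketch below interleaves dense message-passing layers with a placeholder node-selection step. It is a minimal sketch under assumed shapes and names (DenseGCNLayer, select_nodes), not the MEWISPool implementation; in MEWISPool the kept subset would be a weighted independent set learned from the data.

```python
import torch
import torch.nn as nn

class DenseGCNLayer(nn.Module):
    """One round of message passing: aggregate neighbours, then transform."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, x, adj):
        # adj: (N, N) adjacency with self-loops; x: (N, in_dim) node features.
        return torch.relu(self.lin(adj @ x))

def select_nodes(x, adj, keep_idx):
    """Coarsen the graph by keeping only the chosen node subset.
    In MEWISPool the subset would be a learned weighted independent set;
    here it is simply given."""
    return x[keep_idx], adj[keep_idx][:, keep_idx]

# Two-level hierarchy on a toy 4-node graph.
adj = torch.eye(4) + torch.tensor([[0, 1, 0, 1],
                                   [1, 0, 1, 0],
                                   [0, 1, 0, 1],
                                   [1, 0, 1, 0]], dtype=torch.float)
x = torch.randn(4, 8)
conv1, conv2 = DenseGCNLayer(8, 16), DenseGCNLayer(16, 16)
x = conv1(x, adj)
x, adj = select_nodes(x, adj, keep_idx=torch.tensor([0, 2]))  # pooled graph
x = conv2(x, adj)
graph_embedding = x.mean(dim=0)  # readout over the coarsened graph
```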
“…The node signals of the graph are the messages to be transmitted, and the edges connect symbols that might be confused at the output due to the presence of noise. This method does not consider the informational relationship between nodes and their neighbors, but instead aims to maximize the mutual information between the pooled nodes and the whole set of input nodes 19 .…”
Section: Maximum Entropy Weighted Independent Set Pooling (MEWISPool)
confidence: 99%
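The channel reading quoted above can be checked in a few lines: an edge joins two symbols (nodes) that the noisy channel can confuse, so any node set with no internal edges can be transmitted unambiguously. The helper below merely tests that property on the pentagon graph C5, a classic example in discussions of Shannon capacity; it is an illustration, not part of the cited method's code.

```python
def is_independent(edges, nodes):
    """True if no edge of the graph joins two nodes of the candidate set."""
    nodes = set(nodes)
    return not any(u in nodes and v in nodes for u, v in edges)

# Pentagon-shaped confusability graph (C5).
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]
print(is_independent(edges, [0, 2]))  # True: these symbols cannot be confused
print(is_independent(edges, [0, 1]))  # False: edge (0, 1) means possible confusion
```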
“…As shown in Table 2, the accuracy is better with the help of the global representation. The highest average accuracy, 0.865, is achieved by a GNN-based approach with maximum entropy weighted independent set pooling, SetMEWISPool [28], plus global features. For another graph kernel-based deep learning method, DDGK [1], the best performance is achieved with one-hot embeddings instead of hypergraphlet counting.…”
Section: Enzyme Classification
confidence: 99%
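A hedged sketch of the "plus global feature" combination mentioned in the quotation: the pooled graph embedding is concatenated with a hand-crafted global descriptor before the final classifier. The dimensions and the linear head are assumptions for illustration, not the cited paper's configuration; only the six-class output matches the ENZYMES task.

```python
import torch
import torch.nn as nn

graph_embedding = torch.randn(16)  # e.g. readout of the pooled graph
global_feature = torch.randn(4)    # e.g. graph-level statistics (size, density, ...)

classifier = nn.Linear(16 + 4, 6)  # ENZYMES has six enzyme classes
logits = classifier(torch.cat([graph_embedding, global_feature]))
```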
“…Several exceptional graph neural networks have been reported on the PROTEINS and ENZYMES datasets and have achieved good scores [16][17][18] . The most prominent ones are hierarchical structure methods employing convolutional layers as well as message-passing layers 19,20 .…”
Section: Introduction
confidence: 99%