2021
DOI: 10.48550/arxiv.2110.05292
Preprint
Understanding Pooling in Graph Neural Networks

Abstract: Inspired by the conventional pooling layers in convolutional neural networks, many recent works in the field of graph machine learning have introduced pooling operators to reduce the size of graphs. The great variety in the literature stems from the many possible strategies for coarsening a graph, which may depend on different assumptions on the graph structure or the specific downstream task. In this paper we propose a formal characterization of graph pooling based on three main operations, called selection, …

Cited by 2 publications (3 citation statements) | References 33 publications
“…Following the pooling paradigm introduced in [19] for GNNs, we propose to model the layer in (6) as the composition of three operations: a local aggregation step, a selection step, and a reduction step. The local aggregation step is responsible for providing summary signals Z ∈ ℝ^{E×F} of the input signals Z ∈ ℝ^{E×F}, leveraging the connectivity induced by the complex X.…”
Section: Design Of Simplicial Pooling Mapping (mentioning)
confidence: 99%
“…The goal of this work is to introduce pooling strategies for SCNs. Taking inspiration from the select-reduce-connect (SRC) paradigm [19], we introduce a general simplicial pooling layer that comprises three steps: i) a local aggregation step responsible for providing a meaningful summary of the input signals; ii) a selection step responsible for selecting a proper subset of simplices; and iii) a reduction step that downsamples the input complex and the aggregated signals of step i) based on the simplices selected in step ii). By tailoring steps ii) and iii), we introduce four different simplicial pooling layers that generalize the well-known graph pooling strategies.…”
Section: Introduction (mentioning)
confidence: 99%
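The select-reduce-connect decomposition quoted above can be illustrated with a minimal sketch. The code below is a hypothetical top-k instance on a plain graph (not a simplicial complex), with a made-up degree-based scoring rule chosen only to keep the example self-contained; it is not the method of the cited paper.

```python
import numpy as np

def src_pool(X, A, k):
    """Hypothetical sketch of one select-reduce-connect (SRC) pooling step.

    Select:  score nodes and keep the top-k (here, a simple degree score).
    Reduce:  the pooled node features are the rows of X for the kept nodes.
    Connect: the pooled adjacency keeps edges between the kept nodes.
    """
    # Select: score each node by its degree and take the k highest scores.
    scores = A.sum(axis=1)
    idx = np.argsort(-scores)[:k]
    # Reduce: keep the features of the selected nodes.
    X_pool = X[idx]
    # Connect: restrict the adjacency matrix to the selected nodes.
    A_pool = A[np.ix_(idx, idx)]
    return X_pool, A_pool

# Toy example: a path graph on 4 nodes with 2-dimensional node features.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = np.arange(8, dtype=float).reshape(4, 2)
X_pool, A_pool = src_pool(X, A, k=2)
print(X_pool.shape, A_pool.shape)  # → (2, 2) (2, 2)
```

Swapping the scoring rule, the feature reduction, or the connection rule yields different pooling operators within the same three-step template, which is exactly the degree of freedom the quoted passage exploits.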
“…Proof. To prove the theorem, we first prove that average pooling can reduce the distance; in the second step, we prove that attentiveness-weighted pooling constructs a new node with larger attentiveness. By definition, we have: (10) where the first term equals ℓ₂(x_g W_1, avg(x_i) W_2) and the second term equals avg(ℓ₂(x_i W_2, avg(x_j) W_2)), which are thus both non-negative. For shallow GNNs, the node features are usually different, making the second term on the R.H.S. larger than 0.…”
Section: Theoretical Analysis (mentioning)
confidence: 99%
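The first step of the proof quoted above rests on a convexity fact: averaging feature vectors can never increase their ℓ₂ distance to a reference point (Jensen's inequality for the norm). A quick numerical check, with made-up node features and reference vector purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
x_g = rng.normal(size=8)       # hypothetical reference (e.g. graph-level) vector
X = rng.normal(size=(5, 8))    # hypothetical node feature vectors x_i

# Distance from the reference to the average of the node features.
dist_to_avg = np.linalg.norm(x_g - X.mean(axis=0))
# Average of the distances from the reference to each node feature.
avg_dist = np.mean([np.linalg.norm(x_g - x) for x in X])

# Jensen's inequality for the convex l2 norm: averaging first never
# increases the distance.
assert dist_to_avg <= avg_dist
print(dist_to_avg <= avg_dist)  # → True
```

The inequality is strict whenever the node features differ, which matches the quoted remark that for shallow GNNs the second term is typically larger than 0.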