2023
DOI: 10.1109/tpami.2020.2999032

Second-Order Pooling for Graph Neural Networks

Abstract: Graph neural networks have achieved great success in learning node representations for graph tasks such as node classification and link prediction. Graph representation learning requires graph pooling to obtain graph representations from node representations. It is challenging to develop graph pooling methods due to the variable sizes and isomorphic structures of graphs. In this work, we propose to use second-order pooling as graph pooling, which naturally solves the above challenges. In addition, compared to …
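As a concrete illustration of why second-order pooling sidesteps the variable-size and node-ordering challenges mentioned in the abstract, here is a minimal NumPy sketch. The function name and shapes are illustrative assumptions, not the paper's actual code:

```python
import numpy as np

def second_order_pooling(X: np.ndarray) -> np.ndarray:
    """Illustrative sketch of bilinear second-order pooling.

    X: node representation matrix of shape (n, d), where n is the
    number of nodes (which varies across graphs) and d is the
    feature dimension (which is fixed by the network).

    Returns the d x d matrix X^T X, whose size is independent of n
    and which is unchanged under any permutation of the node
    ordering -- the two properties that make it usable as a graph
    pooling operator.
    """
    return X.T @ X

# Graphs with different node counts yield same-sized representations.
X_small = np.random.rand(4, 8)    # 4-node graph, 8-dim features
X_large = np.random.rand(100, 8)  # 100-node graph, same feature dim
assert second_order_pooling(X_small).shape == (8, 8)
assert second_order_pooling(X_large).shape == (8, 8)

# Permutation invariance: reordering the nodes leaves the output unchanged.
perm = np.random.permutation(4)
assert np.allclose(second_order_pooling(X_small),
                   second_order_pooling(X_small[perm]))
```

In a GNN pipeline, `X` would be the node embeddings produced by the message-passing layers; the pooled `d x d` matrix (often flattened) then feeds a classifier.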

Cited by 61 publications (32 citation statements)
References 35 publications
“…TU datasets. We follow the same settings as in previous studies [9], [12] and perform 10-fold cross-validation for performance evaluation. To obtain graph node features, we adopt GIN [2] and set the number of GIN layers to 5 in our experiments.…”
Section: A. Comparison Results With the Existing Graph Classification ...
confidence: 99%
“…The most familiar averaging and summation operations in GNNs [2], [10], [11] belong to flat graph pooling, but they collect only the first-order statistics of the node representations and ignore important higher-order statistics. Therefore, Wang et al. [12] proposed a second-order pooling framework for graph pooling and demonstrated the effectiveness and superiority of second-order statistics for graph neural networks. However, all these methods ignore the non-Euclidean geometric characteristics of graphs, which causes massive information loss and degrades the performance of graph representations.…”
Section: Graph Pooling Methods
confidence: 99%
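The first-order vs. second-order distinction drawn in this citation statement can be made concrete with a small sketch. Mean pooling stands in for the flat pooling discussed above; the helper names are hypothetical:

```python
import numpy as np

def mean_pool(X: np.ndarray) -> np.ndarray:
    """Flat, first-order pooling: per-feature mean, shape (d,)."""
    return X.mean(axis=0)

def second_order_pool(X: np.ndarray) -> np.ndarray:
    """Second-order statistics: feature co-occurrence matrix, shape (d, d)."""
    return X.T @ X

# Two node-feature matrices with identical column means: first-order
# pooling cannot tell them apart, but second-order pooling can,
# because it also captures how features co-vary across nodes.
X1 = np.array([[1.0, 0.0],
               [0.0, 1.0]])
X2 = np.array([[0.5, 0.5],
               [0.5, 0.5]])
assert np.allclose(mean_pool(X1), mean_pool(X2))                      # indistinguishable
assert not np.allclose(second_order_pool(X1), second_order_pool(X2))  # distinguishable
```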
“…that outputs a summarized representation, in order to facilitate combining it with our SSRead layer. We consider only the global aggregation functions officially implemented in PyTorch Geometric, but any other global aggregation, such as second-order pooling [32], can also be combined with our SSRead layer to output the position-level representation.…”
Section: Implementation Details
confidence: 99%
“…GCN in conjunction with an attention mechanism has proven effective in modeling long-range dependencies. Focusing on a node's most relevant hidden representations in a graph, multi-head attention [51] is exploited to stabilize the learning task, leveraging second-order hierarchical pooling [57]. In [34], self-attention is introduced for hierarchical pooling to learn structural information using node features and graph topology.…”
Section: Graph Convolutional Network (GCN)
confidence: 99%