Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining 2019
DOI: 10.1145/3292500.3330982
Graph Convolutional Networks with EigenPooling

Abstract: Graph neural networks, which generalize deep neural network models to graph structured data, have attracted increasing attention in recent years. They usually learn node representations by transforming, propagating and aggregating node features and have been proven to improve the performance of many graph related tasks such as node classification and link prediction. To apply graph neural networks for the graph classification task, approaches to generate the graph representation from node representations are d…
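The abstract describes the usual GNN pipeline of transforming, propagating, and aggregating node features, and the need to turn node representations into a single graph representation for graph classification. Below is a minimal NumPy sketch of that generic pipeline with a plain global mean readout; it is only illustrative and is not the paper's EigenPooling operator (the layer form, the readout, and all names here are assumptions for illustration).

```python
# Minimal sketch (NOT the paper's EigenPooling): one round of
# propagate-and-aggregate over node features, followed by a global mean
# readout that turns node representations into a graph representation.
import numpy as np

def gnn_layer(A, X, W):
    """One simple graph convolution: aggregate neighbor features, then transform.

    A: (n, n) adjacency matrix, X: (n, d) node features, W: (d, d_out) weights.
    """
    A_hat = A + np.eye(A.shape[0])           # add self-loops
    deg = A_hat.sum(axis=1, keepdims=True)   # node degrees
    H = (A_hat / deg) @ X                    # mean aggregation over neighbors
    return np.maximum(H @ W, 0.0)            # linear transform + ReLU

def graph_readout(H):
    """Global mean pooling: collapse node representations into one vector."""
    return H.mean(axis=0)

# Toy 4-node path graph with 3-dimensional node features.
rng = np.random.default_rng(0)
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = rng.normal(size=(4, 3))
W = rng.normal(size=(3, 8))
graph_vec = graph_readout(gnn_layer(A, X, W))   # (8,) graph representation
```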

Cited by 285 publications (142 citation statements)
References 41 publications
“…Recently, graph convolutional network (GCN) models [4,9,17,23,33,35] have been widely studied. For example, [3] first designs the graph convolution operation in the Fourier domain by the graph Laplacian.…”
Section: Related Work
confidence: 99%
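The statement above credits [3] with defining graph convolution in the Fourier domain via the graph Laplacian. The following is a hedged sketch of that general idea under standard assumptions (combinatorial Laplacian, a one-dimensional node signal, and a free spectral filter `theta`); it is not the exact construction of [3].

```python
# Sketch of spectral graph convolution in the Fourier domain: project a node
# signal onto the Laplacian eigenbasis (graph Fourier transform), rescale each
# frequency component with a filter, and transform back.
import numpy as np

def spectral_graph_conv(A, x, theta):
    """A: (n, n) adjacency, x: (n,) node signal, theta: (n,) spectral filter."""
    D = np.diag(A.sum(axis=1))
    L = D - A                          # combinatorial graph Laplacian
    lam, U = np.linalg.eigh(L)         # eigenvectors = graph Fourier basis
    x_hat = U.T @ x                    # graph Fourier transform
    return U @ (theta * x_hat)         # filter in the spectral domain, invert

A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
x = np.array([1.0, -1.0, 0.5, 2.0])
theta = np.ones(4)                     # identity filter: recovers x
y = spectral_graph_conv(A, x, theta)
```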
“…It is used to build hierarchical GNNs, where hierarchical graph pooling is used several times between GNN layers to gradually decrease the number of nodes. The most representative hierarchical graph pooling methods are DIFFPOOL [17], SORTPOOL [16], TOPKPOOL [18], SAGPOOL [19], and EIGENPOOL [20]. A straightforward way to use hierarchical graph pooling for graph representation learning is to reduce the number of nodes to one.…”
Section: Graph Pooling: Global Versus Hierarchical
confidence: 99%
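As a rough illustration of what one hierarchical pooling step does (coarsening the graph between GNN layers so the node count shrinks), here is a sketch in the spirit of the cluster-assignment family such as DIFFPOOL. The soft assignment matrix `S` is random here rather than learned, so this is only a structural sketch and not any of the cited methods.

```python
# One hierarchical pooling step: a soft assignment S maps n nodes to k
# clusters, producing a coarsened adjacency and coarsened features that a
# subsequent GNN layer can operate on.
import numpy as np

def softmax(Z, axis=-1):
    Z = Z - Z.max(axis=axis, keepdims=True)
    E = np.exp(Z)
    return E / E.sum(axis=axis, keepdims=True)

def hierarchical_pool(A, X, S):
    """Coarsen graph (A, X) with a soft assignment S of shape (n, k)."""
    X_coarse = S.T @ X       # (k, d) pooled node features
    A_coarse = S.T @ A @ S   # (k, k) pooled adjacency
    return A_coarse, X_coarse

rng = np.random.default_rng(1)
n, d, k = 6, 4, 2
A = (rng.random((n, n)) < 0.4).astype(float)
A = np.triu(A, 1); A = A + A.T                  # symmetric, no self-loops
X = rng.normal(size=(n, d))
S = softmax(rng.normal(size=(n, k)), axis=1)    # soft node-to-cluster assignment
A2, X2 = hierarchical_pool(A, X, S)             # 6 nodes -> 2 "super-nodes"
```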
“…As discussed in Section 2.1, they are used in flat GNNs after all GNN layers and output the graph representation for the classifier. While flat GNNs outperform hierarchical GNNs in most benchmark datasets [12], developing hierarchical graph pooling is still desired, especially for large graphs [17], [19], [20]. Therefore, we explore a hierarchical graph pooling method based on second-order pooling.…”
Section: Multi-head Attentional Second-order Pooling
confidence: 99%
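For context, plain (non-attentional) second-order pooling summarizes a graph by second-order statistics of its node representations. The sketch below shows only that baseline form; the multi-head attentional variant discussed in the statement additionally learns per-head node weights, which is not implemented here.

```python
# Baseline second-order pooling: summarize a graph by the d x d matrix of
# feature co-occurrence statistics over its node representations.
import numpy as np

def second_order_pool(H):
    """H: (n, d) node representations -> flattened (d*d,) graph vector."""
    G = H.T @ H / H.shape[0]    # second-order statistics across nodes
    return G.reshape(-1)

rng = np.random.default_rng(2)
H = rng.normal(size=(5, 4))     # 5 nodes, 4-dimensional representations
g = second_order_pool(H)        # (16,) graph representation
```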
“…Inspired by the remarkable development of graph neural networks in various domains [13] [40] [27] [39], researchers have noticed their potential for molecular property prediction. Generally, by treating the molecule as a graph, several graph neural networks have been applied [14,24,40] as architectures that can directly deal with non-Euclidean data such as graphs. Variants of graph neural networks, such as MPNN [13] and SchNet [30], can be applied to molecular property prediction, where they use nodes to represent atoms and the edges are weighted by the distances between atoms.…”
Section: Related Work
confidence: 99%
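To make the molecule-as-graph setup described above concrete, here is a hedged sketch that builds a distance-weighted adjacency from atom coordinates. The Gaussian decay with a cutoff is an illustrative assumption, not SchNet's or MPNN's actual edge featurization, and the geometry below is a toy example.

```python
# Turn a molecule into a graph: atoms become nodes, and edge weights decay
# with interatomic distance (Gaussian decay with a hard cutoff, chosen here
# purely for illustration).
import numpy as np

def molecular_graph(positions, cutoff=4.0, gamma=1.0):
    """positions: (n_atoms, 3) coordinates -> (n, n) distance-weighted adjacency."""
    diff = positions[:, None, :] - positions[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)    # pairwise interatomic distances
    W = np.exp(-gamma * dist**2)            # closer atoms -> larger edge weight
    W[dist > cutoff] = 0.0                  # drop long-range pairs
    np.fill_diagonal(W, 0.0)                # no self-edges
    return W

# Toy 3-atom geometry in 3-D space.
pos = np.array([[0.00, 0.00, 0.00],
                [0.96, 0.00, 0.00],
                [-0.24, 0.93, 0.00]])
A = molecular_graph(pos)                    # weighted adjacency for a GNN
```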