2022
DOI: 10.48550/arxiv.2201.07858
Preprint

Decoupling the Depth and Scope of Graph Neural Networks

Abstract: State-of-the-art Graph Neural Networks (GNNs) have limited scalability with respect to the graph and model sizes. On large graphs, increasing the model depth often means exponential expansion of the scope (i.e., the receptive field). Beyond just a few layers, two fundamental challenges emerge: (1) degraded expressivity due to oversmoothing, and (2) expensive computation due to neighborhood explosion. We propose a design principle to decouple the depth and scope of GNNs: to generate the representation of a target entity …
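
To make the decoupling principle concrete, below is a minimal sketch under simple assumptions: the scope is fixed by extracting a small subgraph around the target node, and the model depth is then chosen independently, so message passing never spills beyond that subgraph even when the number of layers exceeds the hop radius. The graph format (plain adjacency lists), the toy mean-aggregation GNN, the helper names extract_khop_subgraph and decoupled_gnn_embedding, and the 2-hop / 6-layer settings are illustrative assumptions, not the paper's implementation.

    import numpy as np
    from collections import deque

    def extract_khop_subgraph(adj, target, k):
        # BFS out to k hops from the target node; the returned node set is the fixed scope.
        visited = {target}
        frontier = deque([(target, 0)])
        while frontier:
            node, dist = frontier.popleft()
            if dist == k:
                continue
            for nbr in adj[node]:
                if nbr not in visited:
                    visited.add(nbr)
                    frontier.append((nbr, dist + 1))
        nodes = sorted(visited)
        index = {n: i for i, n in enumerate(nodes)}
        # Keep only the edges whose endpoints both lie inside the extracted scope.
        sub_adj = [[index[m] for m in adj[n] if m in visited] for n in nodes]
        return nodes, sub_adj, index[target]

    def decoupled_gnn_embedding(adj, feats, target, k=2, depth=6, dim=16, seed=0):
        # Depth (number of layers) is set independently of the scope (the k-hop subgraph),
        # so even with depth > k the computation stays inside the bounded-size subgraph.
        rng = np.random.default_rng(seed)
        nodes, sub_adj, t_idx = extract_khop_subgraph(adj, target, k)
        h = feats[nodes]  # features of the subgraph nodes only
        weights = [rng.normal(0.0, 0.1, (h.shape[1] if l == 0 else dim, dim))
                   for l in range(depth)]
        for W in weights:  # arbitrary depth applied on top of the fixed scope
            agg = np.stack([h[nbrs].mean(axis=0) if nbrs else np.zeros(h.shape[1])
                            for nbrs in sub_adj])
            h = np.maximum((h + agg) @ W, 0.0)  # mean aggregation + linear map + ReLU
        return h[t_idx]  # read out the target node's representation

    # Toy usage: a 5-node path graph with random 8-dimensional features.
    adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
    feats = np.random.default_rng(0).normal(size=(5, 8))
    emb = decoupled_gnn_embedding(adj, feats, target=2, k=2, depth=6)
    print(emb.shape)  # (16,)

In this sketch the 6 message-passing layers repeatedly mix information within the 2-hop subgraph instead of reaching 6 hops out, which is the intended contrast with coupled depth and scope.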

Cited by 1 publication
References 20 publications