Adaptive Universal Generalized PageRank Graph Neural Network
Preprint, 2020
DOI: 10.48550/arxiv.2006.07988

Cited by 24 publications (41 citation statements)
References 0 publications
“…There are also works (Zhu et al, 2020b;Bo et al, 2021;Chien et al, 2020;Yan et al, 2021;Suresh et al, 2021;Pei et al, 2020;Dong et al, 2021;Lim et al, 2021;Yang et al, 2021;Luan et al, 2021;Zhu et al, 2020a;Liu et al, 2021) that extend GNNs to heterophilous graphs. Some methods propose to leverage both low-pass and high-pass convolutional filters in neighborhood aggregation.…”
Section: Related Work
confidence: 99%
“…To generalize GNNs to heterophilous graphs, some recent works (Zhu et al, 2020b;Bo et al, 2021;Chien et al, 2020) have been proposed to leverage high-pass convolutional filters and multi-hop neighbors to address the heterophily issue. On the one hand, the high-pass filters can be used to push away a node's feature vector from its neighbors' while the low-pass filters used by traditional GNNs do the opposite.…”
Section: Introduction
confidence: 99%
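The excerpt above contrasts low-pass aggregation (pulling a node's features toward its neighbors') with high-pass filtering (pushing them apart). A minimal NumPy sketch of the two filters on a toy graph — the adjacency matrix and feature vector here are illustrative assumptions, not data from any cited paper:

```python
import numpy as np

# Toy undirected 4-node graph (hypothetical example).
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

# Symmetrically normalized adjacency: A_hat = D^{-1/2} A D^{-1/2}.
d = A.sum(axis=1)
D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
A_hat = D_inv_sqrt @ A @ D_inv_sqrt

X = np.array([[1.0], [1.0], [0.0], [0.0]])  # one feature per node

# Low-pass filter (standard GNN aggregation): averages over neighbors,
# pulling each node's feature toward those of its neighbors.
low_pass = A_hat @ X

# High-pass filter (I - A_hat, the normalized Laplacian): emphasizes
# the difference between a node and its neighborhood average.
high_pass = X - A_hat @ X
```

By construction the two filters are complementary: their outputs sum back to the original signal, which is why combining both lets a model serve homophilous and heterophilous graphs.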
“…The LGC algorithm arising from our analysis smooths features of nodes over the graph, using the output of label propagation on the features as input to a linear model. The idea of feature smoothing arises in many contexts for understanding GNNs [27,37,38,39], although some studies claim that "over-smoothing" is problematic [40,41,42]. Our model provides a statistical view of feature smoothing and a principled way to find an optimal smoothing level, which we can interpret as a balance between the effects of homophily and noise.…”
Section: The Present Work
confidence: 99%
“…Equations (2.22) and (2.24) both highlight how each column of X is smooth over the graph, and the parameter ω controls the amount of smoothing, as dictated by the generative data model. From the perspective of graph signal processing, similar smoothing has been viewed as a low-pass filter for features [56,36], and feature smoothing is a heuristic for several graph-based learning methods [27,37,38,39]. Our model puts these ideas on firm ground.…”
confidence: 99%
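The excerpt above says the parameter ω controls how strongly each column of X is smoothed over the graph, and that this smoothing acts as a low-pass filter. One common instantiation of such smoothing is Laplacian regularization, solving (I + ωL)Xₛ = X; this specific form, the toy graph, and the values of ω are assumptions made for illustration, not details taken from the excerpt:

```python
import numpy as np

# Toy connected graph (hypothetical example).
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A  # combinatorial graph Laplacian

def smooth(X, omega):
    """Laplacian-regularized smoothing: solve (I + omega*L) X_s = X.

    omega = 0 leaves X unchanged; larger omega pulls each node's
    features toward its neighbors', i.e. acts as a low-pass filter.
    """
    n = L.shape[0]
    return np.linalg.solve(np.eye(n) + omega * L, X)

X = np.array([[1.0], [0.0], [1.0], [0.0]])
X_light = smooth(X, 0.1)   # mild smoothing
X_heavy = smooth(X, 10.0)  # heavy smoothing: features approach the mean
```

Because L annihilates the constant vector, the smoothing preserves the mean of each feature column while shrinking all other spectral components — the precise sense in which it is low-pass.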
“…The central part of our algorithm is a new random-walk-based pooling mechanism called WalkPool which may be interpreted as a learnable version of topological heuristics (Figure 1). Our algorithm is universal in the sense of Chien et al (2020) in that it models both heterophilic and homophilic graphs unlike hardcoded heuristics or standard GNNs which work better on homophilic graphs. WalkPool achieves state-of-the-art link prediction results on all common benchmark datasets, sometimes by a significant margin, even on datasets like USAir (Batagelj & Mrvar, 2006) where the previous state of the art is as high as 97%.…”
Section: Completed Graph
confidence: 99%
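The universality "in the sense of Chien et al (2020)" refers to the Generalized PageRank propagation of the paper above: node features are propagated over multiple hops and recombined with learnable weights γₖ that may be positive or negative, so the same scheme can realize low-pass (homophily) or high-pass-like (heterophily) filters. A minimal sketch — the toy graph and the specific weight values are illustrative assumptions:

```python
import numpy as np

def gpr_propagate(A_hat, H, gammas):
    """Generalized PageRank propagation: Z = sum_k gammas[k] * A_hat^k @ H.

    In GPR-GNN the gammas are learned; allowing them to take either sign
    is what lets one architecture cover homophilous and heterophilous
    graphs alike.
    """
    Z = gammas[0] * H
    Hk = H
    for g in gammas[1:]:
        Hk = A_hat @ Hk      # one more propagation step: A_hat^k @ H
        Z = Z + g * Hk
    return Z

# Toy normalized adjacency for a 3-node path graph (hypothetical).
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
d = A.sum(axis=1)
D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
A_hat = D_inv_sqrt @ A @ D_inv_sqrt

H = np.eye(3)  # stand-in for hidden node features
Z_low  = gpr_propagate(A_hat, H, [0.5, 0.3, 0.2])   # decaying positive: low-pass
Z_high = gpr_propagate(A_hat, H, [0.5, -0.3, 0.2])  # sign-alternating: high-pass-like
```

With γ = [1] the propagation is the identity, and with γ = [0, 1] it reduces to a single standard aggregation step, which makes the scheme easy to sanity-check.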