2020
DOI: 10.1609/aaai.v34i04.5747

Measuring and Relieving the Over-Smoothing Problem for Graph Neural Networks from the Topological View

Abstract: Graph Neural Networks (GNNs) have achieved promising performance on a wide range of graph-based tasks. Despite their success, one severe limitation of GNNs is the over-smoothing issue (indistinguishable representations of nodes in different classes). In this work, we present a systematic and quantitative study on the over-smoothing issue of GNNs. First, we introduce two quantitative metrics, MAD and MADGap, to measure the smoothness and over-smoothness of the graph node representations, respectively. Then, we…
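
As a concrete illustration of the two metrics, the sketch below computes MAD (the mean average cosine distance among a chosen set of node pairs) and MADGap (the MAD over remote pairs minus the MAD over neighbouring pairs) from a node-representation matrix. This is a minimal sketch assuming PyTorch tensors; the function names, the 0/1 pair masks, and their construction are illustrative choices, not the authors' released implementation.

```python
import torch
import torch.nn.functional as F

def mad_value(H: torch.Tensor, mask: torch.Tensor) -> torch.Tensor:
    """MAD over the node pairs selected by `mask`.

    H:    (n, d) node representations.
    mask: (n, n) 0/1 float tensor marking which pairs to average over
          (e.g. neighbouring pairs or remote pairs).
    """
    Hn = F.normalize(H, p=2, dim=1)      # row-normalise so Hn @ Hn.T is cosine similarity
    D = (1.0 - Hn @ Hn.t()) * mask       # cosine distance, restricted to the selected pairs
    row_sum = D.sum(dim=1)
    row_cnt = mask.sum(dim=1)
    keep = row_cnt > 0                   # skip nodes with no selected pair
    return (row_sum[keep] / row_cnt[keep]).mean()

def mad_gap(H: torch.Tensor,
            neighbour_mask: torch.Tensor,
            remote_mask: torch.Tensor) -> torch.Tensor:
    """MADGap = MAD over remote pairs minus MAD over neighbouring pairs.

    A small or negative gap suggests that distant nodes have become as
    indistinguishable as adjacent ones, i.e. over-smoothing.
    """
    return mad_value(H, remote_mask) - mad_value(H, neighbour_mask)
```

In practice the two masks would typically be built from hop counts on the graph (nearby pairs versus pairs several hops apart); the exact cutoffs used in the paper are a detail of the authors' setup and are not assumed here.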

Cited by 691 publications (362 citation statements)
References 11 publications

“…(2) As shown in Figure 5g, the over-smoothing problem appears to persist even with MADReg. Similar conclusions can be drawn from the original MADReg paper [30]: Table 5 in [30] shows that although MADReg improves the performance of standard GNNs, a deep GNN trained with MADReg still performs far worse than its shallow counterpart.…”
Section: Evaluation of Rejection Mechanism (supporting)
confidence: 87%
“…Normalizing the representations of nodes can prevent these representations from becoming too similar. MADReg [30] analyzes the over-smoothing problem from a topological view and introduces a regularization loss that discourages the representations of distant nodes from becoming as similar as those of neighboring nodes. Experimental results (see Section 6.5) show that MADReg can only relieve rather than prevent this problem.…”
Section: Related Work (mentioning)
confidence: 99%
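
To make the regularization idea described in the statement above concrete, here is a rough sketch of how a MADGap-based penalty could be attached to a standard node-classification objective. It reuses the `mad_gap` helper sketched after the abstract; the function name, the `lam` weight, and the mask arguments are assumptions for illustration, not the authors' exact loss.

```python
import torch.nn.functional as F

def madreg_loss(logits, labels, H_last, neighbour_mask, remote_mask, lam=0.01):
    """Cross-entropy plus a MADGap-style penalty (illustrative sketch).

    Subtracting lam * MADGap rewards representations in which remote node
    pairs stay more distinguishable than neighbouring ones.  `mad_gap` is the
    helper sketched earlier; `lam` is an assumed hyper-parameter.
    """
    ce = F.cross_entropy(logits, labels)               # standard supervised term
    gap = mad_gap(H_last, neighbour_mask, remote_mask)
    return ce - lam * gap                              # larger gap -> smaller loss
```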
“…While the idea of learning hierarchical graph representations makes sense, hierarchical GNNs do not consistently outperform flat GNNs [19]. In addition, with advanced techniques like jumping knowledge networks (JK-Net) [11] to address the over-smoothing problem of GNN layers [22], flat GNNs can go deeper and achieve better performance than hierarchical GNNs [12].…”
Section: Graph Pooling: Global Versus Hierarchical (mentioning)
confidence: 99%
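
For context on the jumping-knowledge technique cited above, the following sketch shows a concatenation-style aggregation module that combines the outputs of all GNN layers before the final prediction, which is one way deeper stacks are kept from over-smoothing. The class name and dimensions are illustrative assumptions, not the JK-Net authors' implementation.

```python
import torch
import torch.nn as nn

class JKConcat(nn.Module):
    """Concatenation-style jumping-knowledge aggregation (illustrative sketch)."""

    def __init__(self, layer_dims, out_dim):
        super().__init__()
        # Project the concatenated per-layer features to the output dimension.
        self.proj = nn.Linear(sum(layer_dims), out_dim)

    def forward(self, layer_outputs):
        # layer_outputs: list of (num_nodes, d_l) tensors, one per GNN layer,
        # so the prediction can draw on shallow (local) and deep (global) views.
        return self.proj(torch.cat(layer_outputs, dim=1))

# Example usage with three 64-dimensional layer outputs and 7 classes:
# jk = JKConcat([64, 64, 64], out_dim=7)
# scores = jk([h1, h2, h3])
```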