2021
DOI: 10.48550/arxiv.2101.07730
Preprint

A Unifying Generative Model for Graph Learning Algorithms: Label Propagation, Graph Convolutions, and Combinations

Abstract: Semi-supervised learning on graphs is a widely applicable problem in network science and machine learning. Two standard algorithms, label propagation and graph neural networks, both operate by repeatedly passing information along edges: the former passes labels, while the latter passes node features, modulated by neural networks. These two types of algorithms have largely developed separately, and there is little understanding about the structure of network data that would make one of these approaches wor…
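The abstract's description of label propagation, repeatedly passing labels along edges while clamping the observed ones, can be sketched in a few lines. This is a generic illustration of the technique, not the paper's formulation; the normalization and clamping choices are assumptions.

```python
import numpy as np

# Minimal label propagation sketch on a toy graph. Row-normalized
# averaging and hard clamping are illustrative choices, not the paper's.

def label_propagation(A, labels, mask, num_iters=50):
    """Repeatedly pass labels along edges, clamping known labels.

    A      : (n, n) symmetric adjacency matrix
    labels : (n, c) one-hot label matrix (unlabeled rows are zero)
    mask   : (n,) boolean, True where the label is observed
    """
    P = A / np.maximum(A.sum(axis=1, keepdims=True), 1)  # row-normalize
    Y = labels.astype(float).copy()
    for _ in range(num_iters):
        Y = P @ Y                # average neighbor label distributions
        Y[mask] = labels[mask]   # re-clamp observed labels
    return Y.argmax(axis=1)

# Path graph 0-1-2-3: node 0 is labeled class 0, node 3 is labeled class 1.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
labels = np.zeros((4, 2))
labels[0, 0] = 1.0
labels[3, 1] = 1.0
mask = np.array([True, False, False, True])
pred = label_propagation(A, labels, mask)
# Each interior node inherits the class of its nearest clamped label.
```

Under homophily (neighbors tend to share labels), this simple diffusion alone is often a strong baseline.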

Cited by 10 publications (16 citation statements)
References 35 publications
“…The aggregation process can usually be understood as feature smoothing [20,21,16,42]. Hence, several recent works claim [41,40,4], assume [12,35,38], or remark upon [1,22,14] GNN models' reliance on homophily or their unsuitability for capturing heterophily.…”
Section: Related Work (mentioning)
confidence: 99%
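The "aggregation as feature smoothing" observation quoted above is easy to see concretely: one mean-aggregation step (as in a GCN layer, before any learned transform) pulls neighboring features toward each other. The example below is a toy illustration under that assumption, not code from any cited work.

```python
import numpy as np

# One GCN-style aggregation step: mean over self and neighbors.
# Applying it pulls connected nodes' features toward a common value,
# i.e., it smooths features across edges.

def aggregate(A, X):
    A_hat = A + np.eye(A.shape[0])  # add self-loops
    return (A_hat / A_hat.sum(axis=1, keepdims=True)) @ X

# Two connected nodes with very different features.
A = np.array([[0.0, 1.0],
              [1.0, 0.0]])
X = np.array([[0.0],
              [2.0]])
X1 = aggregate(A, X)  # both features land on the shared mean
```

This smoothing is exactly why such aggregation helps under homophily and can hurt under heterophily, which is the tension the quoted works discuss.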
“…election. This dataset comes from the 2016 presidential election, where nodes are U.S. counties, and edges connect counties with the strongest Facebook social ties [26]. Each node has county-level demographic features (e.g., median income) and social network user features (e.g., fraction of friends within 50 miles).…”
Section: Methods (mentioning)
confidence: 99%
“…Besides MRFs, other GNN approaches have used training labels at inference in graph-based learning. These include diffusions on training labels for data augmentation [23] and post-processing techniques [24,25,26]. Similar in spirit, smoothing techniques model positive label correlations in neighboring nodes [27,28].…”
Section: MRFs with GNNs — A Few Approaches Combine MRFs and Graph Neura… (mentioning)
confidence: 99%
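The post-processing idea mentioned in the statement above, using training labels at inference by diffusing information from them, can be sketched as residual diffusion: propagate the base model's errors on training nodes to nearby nodes. This is a rough sketch in the spirit of those techniques; the function name, update rule, and hyperparameters are all assumptions, not the cited methods themselves.

```python
import numpy as np

# Rough sketch of label-aware post-processing: the residual error observed
# on training nodes is diffused along edges to correct a base model's
# predictions on nearby nodes. All choices here are illustrative.

def correct_with_residuals(A, base_pred, labels, mask, alpha=0.8, num_iters=30):
    P = A / np.maximum(A.sum(axis=1, keepdims=True), 1)  # row-normalize
    E = np.zeros_like(base_pred)
    E[mask] = labels[mask] - base_pred[mask]   # residual on training nodes
    Z = E.copy()
    for _ in range(num_iters):
        Z = (1 - alpha) * E + alpha * (P @ Z)  # propagate residuals
    return base_pred + Z

# Path graph 0-1-2: the base model is maximally uncertain everywhere,
# and only node 0 carries a training label (class 0).
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
base = np.full((3, 2), 0.5)
labels = np.zeros((3, 2))
labels[0, 0] = 1.0
mask = np.array([True, False, False])
corrected = correct_with_residuals(A, base, labels, mask)
# The labeled node and its neighbors are pulled toward class 0.
```

Because residuals of neighboring nodes are positively correlated under homophily, this cheap post-processing step often recovers much of the benefit of a graph-aware model.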

Graph Belief Propagation Networks

Jia, Baykal, Potluru et al. 2021
Preprint | Self-citation
“…In contrast, diffusion-like methods work precisely because of homophily and are typically fast. In the simple case of graphs, combining these two ideas has led to several recent advances [19,15,16].…”
Section: Introduction (mentioning)
confidence: 99%