Proceedings of the Thirty-First International Joint Conference on Artificial Intelligence 2022
DOI: 10.24963/ijcai.2022/293

RAW-GNN: RAndom Walk Aggregation based Graph Neural Network

Abstract: Sequences of group interactions, such as emails, online discussions, and co-authorships, are ubiquitous; and they are naturally represented as a stream of hyperedges (i.e., sets of nodes). Despite its broad potential applications, anomaly detection in hypergraphs (i.e., sets of hyperedges) has received surprisingly little attention, compared to anomaly detection in graphs. While it is tempting to reduce hypergraphs to graphs and apply existing graph-based methods, according to our experiments, taking higher-o…

Cited by 17 publications (6 citation statements). References 0 publications.
“…We conducted fully-supervised node classification tasks and chose node classification accuracy as the metric to evaluate all models. Following the approach in [10,37], we performed random training/validation/testing splits on all datasets, allocating 48% of the nodes for training, 32% for validation, and the remaining nodes for testing. We generated 10 random splits for all datasets and applied the same splits to all models.…”
Section: Methods
confidence: 99%
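The quoted protocol (48%/32%/20% random node splits, generated 10 times and shared across all models) is straightforward to reproduce. Below is a minimal Python/NumPy sketch of how such splits could be generated; the function name, seed, and example graph size are illustrative assumptions, not taken from the cited papers.

```python
# Illustrative sketch (not the authors' code): 10 random 48%/32%/20%
# train/validation/test splits over the nodes of a graph, matching the
# evaluation protocol described in the quoted statement.
import numpy as np

def random_splits(num_nodes, num_splits=10, train_frac=0.48, val_frac=0.32, seed=0):
    """Return a list of (train_idx, val_idx, test_idx) node-index arrays."""
    rng = np.random.default_rng(seed)
    splits = []
    for _ in range(num_splits):
        perm = rng.permutation(num_nodes)
        n_train = int(train_frac * num_nodes)
        n_val = int(val_frac * num_nodes)
        splits.append((perm[:n_train],
                       perm[n_train:n_train + n_val],
                       perm[n_train + n_val:]))   # remaining ~20% of nodes for testing
    return splits

# The same 10 splits are then reused for every model under comparison.
splits = random_splits(num_nodes=2708)   # hypothetical graph size, e.g. Cora-scale
```

Fixing the random seed and reusing the same list of splits for every model keeps the comparison fair, which is the point the quote emphasizes.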
“…GNN-BC (Yang et al., 2022) proposes an innovative graph neural network architecture that maps node attributes and topological structure to distinct representations, introducing exclusivity to reduce redundancy between these two representations. RAW-GNN (Jin et al., 2022) presents a graph neural network framework based on random walk aggregation, using breadth-first random walks to gather homophily-related information and depth-first random walks to gather heterophily-related information. LGLP (Cai et al., 2020) transforms the graph link prediction problem into a node classification problem on the line graph, effectively learning features of the target links.…”
Section: Related Work
confidence: 99%
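For intuition about the breadth-first versus depth-first walks mentioned above, the sketch below shows a node2vec-style second-order biased random walk, where a return parameter p and an in-out parameter q push the walk toward local (BFS-like) or far-reaching (DFS-like) neighborhoods. This is an assumption-laden illustration of the general mechanism only, not RAW-GNN's actual sampler; all names and parameter values are made up for the example.

```python
# Illustrative sketch (assumptions, not the RAW-GNN reference code):
# second-order random walks where a small return parameter p biases the
# walk toward breadth-first (local) neighborhoods and a small in-out
# parameter q biases it toward depth-first (far-reaching) neighborhoods.
import random

def biased_walk(adj, start, length, p=1.0, q=1.0, seed=None):
    """adj: dict mapping node -> set of neighbor nodes."""
    rng = random.Random(seed)
    walk = [start]
    while len(walk) < length:
        cur = walk[-1]
        neighbors = list(adj[cur])
        if not neighbors:
            break
        if len(walk) == 1:                     # first step is unbiased
            walk.append(rng.choice(neighbors))
            continue
        prev = walk[-2]
        weights = []
        for nxt in neighbors:
            if nxt == prev:
                weights.append(1.0 / p)        # return to the previous node
            elif nxt in adj[prev]:
                weights.append(1.0)            # stay within the previous node's neighborhood
            else:
                weights.append(1.0 / q)        # move outward, away from the previous node
        walk.append(rng.choices(neighbors, weights=weights, k=1)[0])
    return walk

# Toy graph given as an adjacency dictionary.
adj = {0: {1, 2}, 1: {0, 2, 3}, 2: {0, 1}, 3: {1, 4}, 4: {3}}
bfs_like = biased_walk(adj, start=0, length=6, p=0.25, q=4.0, seed=1)  # breadth-first flavour
dfs_like = biased_walk(adj, start=0, length=6, p=4.0, q=0.25, seed=1)  # depth-first flavour
```

In the cited framework, the sequences collected by the two walk styles feed separate aggregation channels, so homophily-related and heterophily-related information are kept apart rather than mixed in a single neighborhood average.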
“…H2GCN (Zhu et al., 2020) is one of the first methods shown to work on both kinds of datasets. RAW-GNN (Jin et al., 2022a) is a random-walk-based GCN that exploits both homophily and heterophily by performing random walks and aggregations in two ways: breadth-first for homophily and depth-first for heterophily. CPGNN is a GCN-based architecture that uses a compatibility matrix to model the level of heterophily or homophily in the graph; the matrix can be learned end-to-end, enabling the model to go beyond the assumption of strong homophily.…”
Section: G Further Related Work
confidence: 99%
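The compatibility-matrix idea attributed to CPGNN can be pictured as a belief-propagation-style update: neighbors vote for a node's class through a learnable class-by-class matrix, so under heterophily a neighbor of class A can legitimately raise the belief in a different class B. The snippet below is a loose sketch of that idea under stated assumptions (dense adjacency, softmax priors, a fixed number of rounds); it is not CPGNN's exact update rule.

```python
# Loose sketch of compatibility-matrix propagation (an assumption about
# the general idea, not the published CPGNN update).
import torch

num_nodes, num_classes = 5, 3
A = torch.rand(num_nodes, num_nodes)
A = (A + A.t()) / 2                      # symmetric "adjacency", purely illustrative
B0 = torch.softmax(torch.randn(num_nodes, num_classes), dim=-1)   # prior class beliefs
H = torch.nn.Parameter(torch.eye(num_classes))   # class-compatibility matrix, learnable end-to-end

B = B0
for _ in range(2):                       # a couple of propagation rounds
    B = B0 + A @ B @ H                   # neighbors contribute through the compatibility matrix
```

Initializing H near the identity recovers homophilous propagation, while off-diagonal mass learned during training captures heterophilous class interactions.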
“…Currently, many works that address heterophily can be classified into two categories with respect to scalability. On the one hand, recent architectures that are successful in terms of accuracy (Jin et al., 2022a; Di Giovanni et al., 2022; Zheng et al., 2022b; Luan et al., 2021; Chien et al., 2020; Lei et al., 2022) resemble GCNs in design and thus suffer from the same scalability issues. On the other hand, shallow or node-level models (see, e.g., Lim et al., 2021; Zhong et al., 2022), i.e., models that treat graph data as tabular data and do not involve propagation during training, have shown a lot of promise for large heterophilous graphs.…”
Section: Introduction
confidence: 99%
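The "node-level, tabular" family contrasted with GCN-style models in the quote can be pictured as follows: any graph-dependent feature aggregation is done once as preprocessing, after which training is an ordinary, easily minibatched MLP over fixed feature rows. The sketch below assumes a dense adjacency matrix and mean aggregation purely for illustration; it is not a specific model from the cited works.

```python
# Sketch under assumptions (not a particular published model): precompute
# neighborhood-aggregated features offline, then train a plain MLP on the
# resulting tabular rows, so no graph propagation occurs during training.
import torch
import torch.nn as nn

def precompute_features(A, X, hops=2):
    """Stack raw features with k-hop mean-aggregated features (done once, offline)."""
    feats, cur = [X], X
    deg = A.sum(dim=1, keepdim=True).clamp(min=1)
    for _ in range(hops):
        cur = (A @ cur) / deg            # one round of mean aggregation over neighbors
        feats.append(cur)
    return torch.cat(feats, dim=-1)

num_nodes, in_dim, num_classes = 100, 16, 5
A = (torch.rand(num_nodes, num_nodes) < 0.05).float()   # random adjacency, illustration only
X = torch.randn(num_nodes, in_dim)
Z = precompute_features(A, X)            # fixed, tabular inputs; graph is no longer needed

mlp = nn.Sequential(nn.Linear(Z.shape[1], 64), nn.ReLU(), nn.Linear(64, num_classes))
logits = mlp(Z)                          # from here on, standard minibatch training applies
```

Because no propagation happens inside the training loop, the cost of a training step no longer scales with the graph's size or density, which is the scalability advantage the quote alludes to.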