2021
DOI: 10.1109/tbdata.2021.3081431
Hyperbolic Graph Attention Network

Abstract: Graph neural networks (GNNs) have shown superior performance in dealing with graphs and have attracted considerable research attention recently. However, most existing GNN models are primarily designed for graphs in Euclidean spaces. Recent research has shown that graph data exhibit non-Euclidean latent anatomy. Unfortunately, GNNs in non-Euclidean settings have rarely been studied so far. To bridge this gap, in this paper we study GNNs with an attention mechanism in hyperbolic spaces at the first…

Cited by 63 publications (48 citation statements). References 30 publications.
“…Some spatial methods focus on improving model capacity by introducing an attention mechanism to the graph domain, such as the graph attention network (GAT), which adopts a self-attention mechanism to learn the weighting function [4]. Developments of GAT include the dual-primal graph convolutional network (DPGCN) [22], which generalizes GAT by using convolutions on both nodes and edges for better performance; the temporal graph attention network (TempGAN) [23], which learns node representations from continuous-time temporal graphs; and the hyperbolic graph attention network [24], which learns robust node representations of graphs in hyperbolic spaces. The graph sample and aggregate method (GraphSAGE) [18], a node-based spatial method, learns node rather than graph embeddings, so it is graph-scale free and can be applied to large or evolving graphs.…”
Section: Related Workmentioning
confidence: 99%
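The GAT weighting function quoted above can be sketched in a few lines: each edge is scored from the concatenated transformed features of its endpoints, and the scores are softmax-normalized over each node's neighborhood. This is a minimal NumPy illustration of the mechanism, not the paper's implementation; the function names, shapes, and the LeakyReLU slope are illustrative assumptions.

```python
import numpy as np

def leaky_relu(x, slope=0.2):
    # GAT applies a LeakyReLU nonlinearity to raw attention scores
    return np.where(x > 0, x, slope * x)

def gat_attention(h, adj, W, a):
    """Sketch of single-head GAT-style self-attention.

    h   : (N, F)  input node features
    adj : (N, N)  adjacency matrix (nonzero = edge; include self-loops)
    W   : (F, F') shared linear transform
    a   : (2F',)  attention vector scoring concatenated endpoint features
    """
    z = h @ W                                   # (N, F') transformed features
    out = np.zeros_like(z)
    for i in range(z.shape[0]):
        nbrs = np.where(adj[i] > 0)[0]          # neighborhood of node i
        # raw score e_ij = LeakyReLU(a^T [z_i || z_j]) for each neighbor j
        e = np.array([leaky_relu(a @ np.concatenate([z[i], z[j]]))
                      for j in nbrs])
        alpha = np.exp(e - e.max())             # softmax over the neighborhood
        alpha /= alpha.sum()
        # aggregate neighbors weighted by learned attention coefficients
        out[i] = (alpha[:, None] * z[nbrs]).sum(axis=0)
    return out
```

In practice `W` and `a` are learned by backpropagation and multiple attention heads are concatenated; the sketch only shows the forward weighting step.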
“…STC (conv only), or vanilla STC, is based on Equation (16) to perform convolution only. STC-NARS is based on Equations (24) and (25) to do graph representation learning on large heterogeneous graphs.…”
Section: Variantsmentioning
confidence: 99%
“…Hyperbolic geometry has received increasing attention in the machine learning and network science communities due to its attractive properties for modeling data with latent hierarchies. It has been applied to neural networks for problems in computer vision and natural language processing [16,29,30,32], and to graph embedding tasks [6,16,24,40]. In the graph embedding field, recent works including HGNN [24], HGCN [6], and HGAT [40] (the names of these methods are taken from the corresponding literature) generalize graph convolution into hyperbolic space by moving the aggregation operation to the tangent space, where vector operations can be performed.…”
Section: Related Workmentioning
confidence: 99%
“…It has been applied to neural networks for problems in computer vision and natural language processing [16,29,30,32], and to graph embedding tasks [6,16,24,40]. In the graph embedding field, recent works including HGNN [24], HGCN [6], and HGAT [40] generalize graph convolution into hyperbolic space by moving the aggregation operation to the tangent space, where vector operations can be performed. HGNN [24] focuses more on graph classification tasks and provides an extension to dynamic graph embeddings.…”
Section: Related Workmentioning
confidence: 99%
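The tangent-space trick described in the statements above can be sketched concretely: map points from the hyperbolic manifold to the tangent space at the origin with a logarithmic map, perform the ordinary (Euclidean) neighborhood aggregation there, then map the result back with an exponential map. The sketch below uses the Poincaré ball with curvature -1 and a plain degree-normalized mean as the aggregator; the function names and the choice of aggregator are illustrative assumptions, not the exact formulation of any of the cited methods.

```python
import numpy as np

def logmap0(x, eps=1e-9):
    """Logarithmic map at the origin of the Poincaré ball (curvature -1):
    sends a point inside the unit ball to the tangent space at 0."""
    n = np.linalg.norm(x, axis=-1, keepdims=True).clip(eps, 1 - eps)
    return np.arctanh(n) * x / n

def expmap0(v, eps=1e-9):
    """Exponential map at the origin: sends a tangent vector back
    to a point inside the unit ball."""
    n = np.linalg.norm(v, axis=-1, keepdims=True).clip(eps, None)
    return np.tanh(n) * v / n

def tangent_aggregate(x, adj):
    """Hyperbolic neighborhood aggregation via the tangent space:
    log-map, Euclidean mean over neighbors, exp-map back."""
    v = logmap0(x)                          # (N, d) tangent vectors at 0
    deg = adj.sum(axis=1, keepdims=True)    # neighborhood sizes
    agg = (adj @ v) / deg                   # ordinary mean in tangent space
    return expmap0(agg)                     # back onto the manifold
```

Because `tanh` maps every tangent vector back inside the unit ball, the aggregated points are guaranteed to remain valid hyperbolic embeddings; attention-weighted variants (as in HGAT) replace the plain mean with learned coefficients computed in the same tangent space.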
See 1 more Smart Citation