ICASSP 2019 - IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2019
DOI: 10.1109/icassp.2019.8682836

A Recurrent Graph Neural Network for Multi-relational Data

Abstract: The era of "data deluge" has sparked interest in graph-based learning methods in a number of disciplines, such as sociology, biology, neuroscience, and engineering. In this paper, we introduce a graph recurrent neural network (GRNN) for scalable semi-supervised learning from multi-relational data. Key aspects of the novel GRNN architecture are the use of multi-relational graphs, the dynamic adaptation to the different relations via learnable weights, and the consideration of graph-based regularizers to promote…
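To make the abstract's description concrete, here is a minimal sketch of one multi-relational graph-recurrent update in Python. It mixes R relation-specific adjacency matrices with learnable scalar weights before a shared feature transform. All names and shapes (grnn_step, alpha, W) are illustrative assumptions, not the authors' implementation, and training would replace this NumPy forward pass with an autodiff framework.

import numpy as np

def grnn_step(adjs, H, alpha, W):
    """One recurrent propagation step over a multi-relational graph.

    adjs  : list of R (N, N) adjacency matrices, one per relation
    H     : (N, F) current hidden state (node features at step 0)
    alpha : (R,) learnable relation-mixing weights
    W     : (F, F) learnable feature transform, shared across steps
    """
    # Dynamic adaptation to the relations: weighted sum of relation graphs.
    A_mix = sum(a * A for a, A in zip(alpha, adjs))
    # Propagate over the mixed graph, transform features, apply nonlinearity.
    return np.tanh(A_mix @ H @ W)

# Usage: unroll the same shared-weight step K times, as in a recurrent model.
rng = np.random.default_rng(0)
N, F, R, K = 5, 4, 2, 3
adjs = [rng.random((N, N)) for _ in range(R)]
H = rng.standard_normal((N, F))
alpha = rng.random(R)
W = rng.standard_normal((F, F)) / np.sqrt(F)
for _ in range(K):
    H = grnn_step(adjs, H, alpha, W)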

Cited by 30 publications (19 citation statements). References 15 publications.
“…The graph convolutional NN (GCNN) [21], [22] learning approaches are designed to process data defined over graphs. These models also lead to sparsification of the weight matrices connecting the hidden layers in the neural network.…”
Section: Graph-Pruned Neural Network for DSSE
confidence: 99%
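The sparsification point in this excerpt can be made concrete. For a single-shift graph convolution H' = S X W, the vec identity vec(SXW) = (Wᵀ ⊗ S) vec(X) shows that the equivalent fully connected layer has weight matrix Wᵀ ⊗ S, which inherits every zero of the graph operator S. The following NumPy check is a sketch under that assumed convolution form, not the exact models of [21], [22].

import numpy as np

rng = np.random.default_rng(1)
N, F_in, F_out = 6, 3, 2
# Sparse symmetric graph operator S (an assumed single-shift convolution).
upper = np.triu((rng.random((N, N)) < 0.3).astype(float), 1)
S = upper + upper.T
X = rng.standard_normal((N, F_in))       # node features
W = rng.standard_normal((F_in, F_out))   # learnable feature transform

Y = S @ X @ W                            # graph-convolutional layer output
dense_equiv = np.kron(W.T, S)            # equivalent fully connected weights
# The flattened outputs agree, so the dense layer really is W^T kron S,
# and its zero pattern is dictated by the sparsity of S.
assert np.allclose(dense_equiv @ X.flatten(order="F"), Y.flatten(order="F"))
print("zero fraction of equivalent weights:", np.mean(dense_equiv == 0.0))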
“…2b are constrained to be scaled versions of each other, then the proposed method becomes equivalent to the GCNN in [21]. On the other hand, if the vertically aligned blocks in the weight matrices of PAWNN are chosen to be scaled versions of a fixed matrix, then the learning model of [22] emerges as a special case. For example, for the Fig.…”
Section: Graph-Pruned Neural Network for DSSE
confidence: 99%
“…Related work on GRNNs [22]–[25] considers only regression problems and is thus limited to outputting sequences of graph signals, with [22]–[24] targeting traffic forecasting specifically. Other somewhat related works include the gated graph sequence neural networks [26] and the recurrent formulation in [27]. The architecture in [26] learns sequential representations from graphs, and not from graph signals or processes; this is a fundamental difference, since in learning from graphs the graph is seen as data, while in learning from graph signals the graph is given (i.e.…”
Section: Introduction
confidence: 99%
“…a hyperparameter of the learning architecture). The work in [27] uses recurrence as a means of re-introducing the input at every layer to capture multiple types of diffusion, but does not consider data consisting of temporal sequences. While time gating has been discussed in [22]–[26], the novel spatial gating strategies that we put forward leverage the graph's node and edge structures to control how long-range spatial dependencies on the graph are encoded.…”
Section: Introduction
confidence: 99%
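For contrast with the spatial gating this excerpt advocates, here is a hedged sketch of the "time gating" it references: a GRU-style update gate decides, per node and feature, how much of the previous hidden state survives each step. The gate placement and shapes are assumptions for illustration, not the cited papers' exact formulations.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gated_grnn_step(S, x_t, h_prev, Wx, Wh, Ux, Uh):
    """One time-gated recurrent update over a fixed (N, N) graph operator S."""
    z = sigmoid(S @ x_t @ Ux + S @ h_prev @ Uh)        # update (time) gate
    h_cand = np.tanh(S @ x_t @ Wx + S @ h_prev @ Wh)   # candidate state
    return z * h_prev + (1.0 - z) * h_cand             # gated blend

# Usage on a short sequence of graph signals.
rng = np.random.default_rng(2)
N, F, T = 4, 3, 5
S = (rng.random((N, N)) < 0.4).astype(float)
Wx, Wh, Ux, Uh = (rng.standard_normal((F, F)) / np.sqrt(F) for _ in range(4))
h = np.zeros((N, F))
for _ in range(T):
    h = gated_grnn_step(S, rng.standard_normal((N, F)), h, Wx, Wh, Ux, Uh)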