2018 IEEE International Conference on Big Data (Big Data) 2018
DOI: 10.1109/bigdata.2018.8622636
Temporal Graph Offset Reconstruction: Towards Temporally Robust Graph Representation Learning

Abstract: Graphs are a commonly used construct for representing relationships between elements in complex high-dimensional datasets. Many real-world phenomena are dynamic in nature, meaning that any graph used to represent them is inherently temporal. However, many of the machine learning models designed to capture knowledge about the structure of these graphs ignore this rich temporal information when creating representations of the graph. This results in models which do not perform well when used to make predictions …

Cited by 12 publications (12 citation statements). References 33 publications.
“…Static models are used on the graph snapshots to make predictions for the next one (DNE (Du et al, 2018), TO-GAE (Bonner et al, 2018), DynGEM (Goyal et al, 2018)). Another way of implementing this methodology is TI-GCN (Time Interval Graph Convolutional Networks), which uses residual architectures.…”
Section: Related Work
confidence: 99%
“…These approaches have also been explored by several dynamic graph embedding methods [71,81]. Going further, instead of using RNNs to capture the characteristics of the network over time, many subsequent models employ convolutional networks (for instance, 1D CNNs) to leverage temporal dependencies [82,53,83,84].…”
Section: Approaches Based On Deep Learning
confidence: 99%
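The temporal convolutions mentioned in the statement above slide a filter over each node's sequence of embeddings, mixing information from a fixed window of timesteps. As a minimal sketch (assuming a per-node embedding sequence of shape `(T, d)` and a hypothetical filter of width `w` — the shapes and function name are illustrative, not from any cited model):

```python
import numpy as np

def temporal_conv1d(seq, kernel):
    """Valid-mode 1D convolution over the time axis of one node's embeddings.

    seq:    (T, d)        embedding sequence across T graph snapshots
    kernel: (w, d, d_out) temporal filter of width w
    returns (T - w + 1, d_out)
    """
    T, d = seq.shape
    w, _, d_out = kernel.shape
    out = np.zeros((T - w + 1, d_out))
    for t in range(T - w + 1):
        window = seq[t:t + w]                      # (w, d) slice of timesteps
        out[t] = np.einsum('wd,wdo->o', window, kernel)  # mix over window and features
    return out

rng = np.random.default_rng(0)
T, d, w, d_out = 6, 8, 3, 4
seq = rng.normal(size=(T, d))
kernel = rng.normal(size=(w, d, d_out))
out = temporal_conv1d(seq, kernel)
print(out.shape)  # (4, 4)
```

Each output row depends only on `w` consecutive snapshots, which is how such models leverage short-range temporal dependencies without the recurrence of an RNN.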
“…Variational graph autoencoders (VGAE) [87] and Graph-GAN [88] apply these approaches to static graphs. In dynamic graphs, these generative models are used to learn the data distributions over time [82,31,89,90,91,84,44].…”
Section: Approaches Based On Deep Learning
confidence: 99%
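The generative step of a VGAE samples latent node embeddings from per-node Gaussians and reconstructs edge probabilities with an inner-product decoder. A minimal sketch of that decode step (the encoder producing `mu` and `logvar` is omitted; shapes and the function name are illustrative):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def vgae_decode(mu, logvar, rng):
    """Inner-product decoder of a variational graph autoencoder.

    mu, logvar: (n, d) per-node Gaussian parameters from the encoder
    returns:    (n, n) matrix of reconstructed edge probabilities
    """
    eps = rng.normal(size=mu.shape)
    z = mu + np.exp(0.5 * logvar) * eps  # reparameterisation trick
    return sigmoid(z @ z.T)              # p(A_ij) from latent similarity

rng = np.random.default_rng(1)
n, d = 5, 3
mu = rng.normal(size=(n, d))
logvar = rng.normal(size=(n, d))
a_hat = vgae_decode(mu, logvar, rng)
print(a_hat.shape)  # (5, 5)
```

Because the decoder is `sigmoid(z @ z.T)`, the reconstruction is symmetric with entries in (0, 1); dynamic extensions fit such a model per snapshot (or condition it on time) to learn how the edge distribution evolves.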