2018
DOI: 10.48550/arxiv.1812.04206
Preprint

GC-LSTM: Graph Convolution Embedded LSTM for Dynamic Link Prediction

Jinyin Chen,
Xueke Wang,
Xuanheng Xu

Abstract: Dynamic link prediction is a research hotspot in the complex-networks area, especially for its wide applications in biology, social networks, economics, and industry. Compared with static link prediction, the dynamic task is much more difficult since the network structure evolves over time. Most current research focuses on static link prediction, which cannot achieve the expected performance on dynamic networks. Aiming at low AUC, high error rate, and the difficulty of predicting added/removed links, we propose GC-LSTM, a Graph Convolution Network (…

Cited by 25 publications (35 citation statements) · References 40 publications
“…GCN [20] and GraphSage [23] are supervised methods that use the static graph structure and node attributes; both ignore temporal information. GC-LSTM [24] is another supervised method that exploits the temporal information of both the graph structure and node attributes. RNNGCN [2] employs a 2-layer GCN and introduces a decay weight as a learnable parameter; information from each timestep is multiplied by this weight, which decays over time, and the resulting linear combination over time is used for classification.…”
Section: B. Baselines and Metrics (citation type: mentioning)
confidence: 99%
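
To make the decay-weight mechanism attributed to RNNGCN in the excerpt above concrete, here is a minimal PyTorch sketch. The class name, the sigmoid parameterization, and the exponent scheme are illustrative assumptions, not the cited paper's code: each timestep's embedding is scaled by a learnable decay factor raised to the elapsed time, and the weighted sum feeds a linear classifier.

```python
import torch
import torch.nn as nn

class DecayCombiner(nn.Module):
    """Minimal sketch of a decay-weighted temporal combination
    (names and details are illustrative, not RNNGCN's actual code)."""

    def __init__(self, in_dim, num_classes):
        super().__init__()
        # Learnable decay parameter, squashed into (0, 1) via sigmoid.
        self.decay_logit = nn.Parameter(torch.zeros(1))
        self.classifier = nn.Linear(in_dim, num_classes)

    def forward(self, embeddings_per_step):
        # embeddings_per_step: list of [num_nodes, in_dim] tensors,
        # ordered oldest -> newest (e.g., per-timestep GCN outputs).
        decay = torch.sigmoid(self.decay_logit)
        T = len(embeddings_per_step)
        combined = 0.0
        for t, h in enumerate(embeddings_per_step):
            # Older timesteps receive smaller weights: decay^(T-1-t).
            combined = combined + (decay ** (T - 1 - t)) * h
        return self.classifier(combined)
```

Because the decay factor is a single learnable scalar, the model can itself learn how quickly older snapshots should be discounted, which matches the "learnable parameter" phrasing in the excerpt.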
“…The purpose of a GNN is to apply deep neural network models to graph representation learning, mapping graphs into low-dimensional vector spaces for downstream tasks such as link prediction [35,4], node classification [23,1], and recommendation [22,36,7]. The notion of graph neural networks was initially outlined in [10]; Franco et al. [24] then extended recursive neural networks to graph learning tasks, and Li et al. [18] treated neighborhood information as the time-step input of gated recurrent units.…”
Section: Graph Neural Network (citation type: mentioning)
confidence: 99%
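
As a concrete illustration of mapping a graph into a low-dimensional vector space, the following is a minimal single-layer graph convolution: symmetrically normalized neighbor aggregation followed by a learned linear map. It sketches the general GCN idea only; all names are assumptions rather than any cited paper's implementation.

```python
import torch
import torch.nn as nn

class SimpleGCNLayer(nn.Module):
    """One graph-convolution step: aggregate neighbor features through a
    normalized adjacency, then apply a learned linear map (a minimal
    sketch of the general GCN idea, not a specific paper's code)."""

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, adj, x):
        # adj: [N, N] adjacency matrix; x: [N, in_dim] node features.
        a_hat = adj + torch.eye(adj.size(0))           # add self-loops
        deg = a_hat.sum(dim=1)                         # node degrees
        d_inv_sqrt = deg.pow(-0.5)
        # Symmetric normalization: D^{-1/2} (A + I) D^{-1/2}.
        a_norm = d_inv_sqrt.unsqueeze(1) * a_hat * d_inv_sqrt.unsqueeze(0)
        return torch.relu(self.linear(a_norm @ x))     # propagate + transform
```

Stacking such layers yields node embeddings whose dimension `out_dim` is far smaller than the graph itself, which is the low-dimensional mapping the excerpt describes.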
“…In order to deal with complicated time-varying graphs, it is necessary and crucial to preprocess the raw dynamic graph representations, which record the continuous evolution of the graph over time, such as nodes emerging/disappearing and links being added/deleted [9,10,11,12,13]. Current research [14,15,16,17,18,19,20] refines the raw dynamic representations into two main branches: continuous and discrete dynamic graphs. For the former, the raw representations are projected onto a single 2D temporal graph that stores most of the information about the graph's evolution.…”
Section: Introduction (citation type: mentioning)
confidence: 99%
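
A small sketch of the discrete branch of this preprocessing: bucketing a continuous record of timestamped link events into fixed-length snapshots. The function name and the simplifying assumption that only link additions are recorded are ours, not taken from the cited works.

```python
from collections import defaultdict

def events_to_snapshots(events, window):
    """Bucket timestamped link events into discrete graph snapshots.

    events: iterable of (u, v, t) edge events (a continuous-time record);
    window: snapshot length in the same time unit as t.
    Returns {snapshot_index: set of (u, v) edges}. A minimal sketch of
    discrete-graph preprocessing, assuming only link additions occur.
    """
    snapshots = defaultdict(set)
    for u, v, t in events:
        snapshots[int(t // window)].add((u, v))
    return dict(snapshots)

# Usage: three events split into two snapshots of length 10.
snaps = events_to_snapshots([(0, 1, 2.5), (1, 2, 7.0), (2, 3, 12.0)], window=10)
# -> {0: {(0, 1), (1, 2)}, 1: {(2, 3)}}
```

The continuous branch would instead keep the raw `(u, v, t)` events intact, trading the convenience of snapshots for finer temporal resolution.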
“…It is crucial to have an efficient and powerful network for the specific task in this step. Some studies [16,19,20] utilize recurrent neural networks (RNNs) to process representations over the sequence of dynamic graphs. However, RNN-based DGNNs are more time-consuming and handle sequential time-dependent embeddings poorly as the number of time steps grows.…”
Section: Introduction (citation type: mentioning)
confidence: 99%
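
The RNN-over-snapshots recipe this excerpt refers to can be sketched as follows: a graph convolution embeds each snapshot, and an LSTM cell carries per-node state across time steps. This is a hedged illustration of the general pattern under simplifying assumptions (a single shared GCN weight, row-normalized adjacency, static node features), not the exact GC-LSTM architecture of Chen et al.

```python
import torch
import torch.nn as nn

class GCNOverLSTM(nn.Module):
    """Sketch of the RNN-over-snapshots pattern: a graph convolution
    embeds each snapshot, and an LSTM cell evolves per-node state
    across timesteps. Illustrates the general recipe only."""

    def __init__(self, in_dim, hidden_dim):
        super().__init__()
        self.gcn = nn.Linear(in_dim, hidden_dim)       # shared GCN weight
        self.cell = nn.LSTMCell(hidden_dim, hidden_dim)

    def forward(self, adjs, feats):
        # adjs: list of [N, N] snapshot adjacencies; feats: [N, in_dim].
        n = feats.size(0)
        h = torch.zeros(n, self.cell.hidden_size)
        c = torch.zeros(n, self.cell.hidden_size)
        for adj in adjs:
            a_hat = adj + torch.eye(n)                 # add self-loops
            deg_inv = a_hat.sum(dim=1).reciprocal()
            # Row-normalized propagation D^{-1}(A + I), then transform.
            z = torch.relu(self.gcn((deg_inv.unsqueeze(1) * a_hat) @ feats))
            h, c = self.cell(z, (h, c))                # per-node LSTM state
        return h                                       # final node embeddings
```

The sequential loop over snapshots also makes the excerpt's criticism visible: each timestep must wait for the previous one, so runtime grows linearly with the number of snapshots.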