2022
DOI: 10.1609/aaai.v36i11.21682
Learning to Evolve on Dynamic Graphs (Student Abstract)

Abstract: Representation learning on dynamic graphs is a challenging problem because the graph topology and node features vary over time. This requires the model to effectively capture both graph topology information and temporal information. Most existing works are built on recurrent neural networks (RNNs), which are used to extract the temporal information of dynamic graphs, and thus they inherit the drawbacks of RNNs. In this paper, we propose Learning to Evolve on Dynamic Graphs (LEDG), a novel…

Cited by 1 publication (3 citation statements)
References 1 publication
“…Now, we present comprehensive experiments to evaluate our proposed framework. We borrow datasets and its preprocessing/splitting settings used in previous best baselines (Xiang, Huang, and Wang 2022;Pareja et al 2020). Datasets: Table ??…”
Section: Methods (confidence: 99%)
“…The best hyperparameters search has the range as: Number of layers ∈ {1, 2}, Hidden dimension ∈ {32, 64, 128}, Number of heads ∈ {4, 8, 16}, Filter order ∈ {4, 8, 16}, Wavelet scales ∈ [0.1, 10]. The rest of the parameters and settings are borrowed from previous works (Pareja et al 2020;Xiang, Huang, and Wang 2022). Variants of our Framework: In the main results we show three variants of our method with different aggregators.…”
Section: Methods (confidence: 99%)
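The hyperparameter ranges quoted in the citation statement above can be sketched as a small search space. This is a minimal illustration only; the dictionary keys and function names below are assumptions for the sketch, not identifiers from the cited works (Pareja et al. 2020; Xiang, Huang, and Wang 2022):

```python
import itertools
import random

# Discrete choices quoted in the citation statement; key names are illustrative.
grid = {
    "num_layers": [1, 2],
    "hidden_dim": [32, 64, 128],
    "num_heads": [4, 8, 16],
    "filter_order": [4, 8, 16],
}

def sample_config(rng: random.Random) -> dict:
    """Draw one candidate configuration from the search space.

    Wavelet scales come from the continuous interval [0.1, 10],
    so they are sampled rather than enumerated.
    """
    cfg = {name: rng.choice(values) for name, values in grid.items()}
    cfg["wavelet_scale"] = rng.uniform(0.1, 10)
    return cfg

rng = random.Random(0)
cfg = sample_config(rng)

# The discrete part of the grid enumerates 2 * 3 * 3 * 3 = 54 combinations.
n_discrete = len(list(itertools.product(*grid.values())))
```

A full sweep would evaluate each of the 54 discrete combinations, optionally with several sampled wavelet scales per combination.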