2023
DOI: 10.48550/arxiv.2302.11636
Preprint

Do We Really Need Complicated Model Architectures For Temporal Networks?

Abstract: Recurrent neural network (RNN) and self-attention mechanism (SAM) are the de facto methods to extract spatial-temporal information for temporal graph learning. Interestingly, we found that although both RNN and SAM could lead to a good performance, in practice neither of them is always necessary. In this paper, we propose GraphMixer, a conceptually and technically simple architecture that consists of three components: (1) a link-encoder that is only based on multi-layer perceptrons (MLP) to summarize the informa…
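The truncated abstract describes the first of GraphMixer's components, a link-encoder built only from MLPs. As a rough illustration of that idea, below is a minimal, hypothetical PyTorch sketch of an MLP-only encoder that summarizes a node's K most recent links with a fixed cosine time encoding and mean pooling; the class name, frequency schedule, and all dimensions are illustrative assumptions, not the paper's reference implementation.

```python
# Hypothetical sketch of an MLP-only link encoder in the spirit of the abstract above.
# Names, dimensions, and the time-encoding schedule are assumptions for illustration.
import torch
import torch.nn as nn


def time_encode(dt: torch.Tensor, dim: int = 16) -> torch.Tensor:
    # Fixed (non-learnable) cosine features of the time gap, a common choice in
    # temporal graph models; the geometric frequency schedule here is an assumption.
    freqs = 1.0 / torch.pow(10.0, torch.linspace(0, 4, dim))
    return torch.cos(dt.unsqueeze(-1) * freqs)


class LinkEncoder(nn.Module):
    """Summarize a node's K most recent links with plain MLPs (no RNN, no attention)."""

    def __init__(self, link_dim: int, time_dim: int = 16, hidden: int = 64, out: int = 32):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(link_dim + time_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, out),
        )

    def forward(self, link_feats: torch.Tensor, time_gaps: torch.Tensor) -> torch.Tensor:
        # link_feats: [batch, K, link_dim]; time_gaps: [batch, K] (time since each link)
        x = torch.cat([link_feats, time_encode(time_gaps)], dim=-1)
        h = self.mlp(x)        # per-link embedding
        return h.mean(dim=1)   # mean-pool over the K recent links


# Example usage with random data: 4 nodes, 20 recent links each, 8-dim link features
enc = LinkEncoder(link_dim=8)
print(enc(torch.randn(4, 20, 8), torch.rand(4, 20)).shape)  # torch.Size([4, 32])
```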

Cited by 6 publications (11 citation statements)
References 12 publications (18 reference statements)
“…Subsequently, many graph models emerged such as [4,19,22,29,31,36,38], further increasing the popularity of dynamic graph modeling. However, recent research has found that the "graph module" in graph models is not necessary [6,28]. In [6], the authors simply use a multi-layer perceptron (MLP) to model one-hop historical neighbors' information and achieve better results than previous graph models, causing researchers to reconsider the necessity of graph modules.…”
Section: Related Work 2.1 Dynamic Graph Modeling (mentioning)
confidence: 99%
“…For evaluation, we choose nine dynamic graph modeling methods to compare with ours, including CTDNE [17], DyRep [23], JODIE [11], TGAT [33], TGN [19], CAW [31], TIGER [36], GraphMixer [6], PINT [22]. Note that CTDNE cannot be applied in the inductive setting, and CAW cannot be conducted in the evolving node classification task.…”
Section: Experiments 5.1 Datasets and Baselines (mentioning)
confidence: 99%
“…In addition to these, various studies [24,15,16,17] have been conducted on continuous-time TGNNs. Most research on continuous-time TGNNs has demonstrated their advancement using link prediction as a downstream task, so we mainly investigate link prediction as our target task to study the effect of adversarial attacks.…”
Section: Temporal Graph Neural Networks (TGNNs) (mentioning)
confidence: 99%
“…TGNNs jointly learn the temporal, structural, and contextual relationships present in CTDGs by encoding graph information into time-aware node embeddings. By incorporating temporal information, TGNNs have demonstrated superior performance over static GNNs in various tasks like link prediction [17,18] and dynamic node classification [12]. Nonetheless, in contrast to the actively researched adversarial attacks on static GNNs, the vulnerabilities of TGNNs to adversarial attacks remain underexplored, yet the significance of conducting such research is undeniable.…”
Section: Introduction (mentioning)
confidence: 99%
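The statement above notes that TGNNs encode graph information into time-aware node embeddings and that link prediction is the usual downstream task. As a rough sketch of that downstream step, here is a minimal, hypothetical scorer that takes the two endpoint embeddings at query time and outputs an edge probability; the concatenate-and-MLP decoder and all names are assumptions for illustration, not any specific TGNN's design.

```python
# Hypothetical link-prediction head over time-aware node embeddings.
import torch
import torch.nn as nn


class LinkPredictor(nn.Module):
    """Score a candidate edge (u, v) from two time-aware node embeddings."""

    def __init__(self, emb_dim: int, hidden: int = 64):
        super().__init__()
        self.scorer = nn.Sequential(
            nn.Linear(2 * emb_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, z_u: torch.Tensor, z_v: torch.Tensor) -> torch.Tensor:
        # z_u, z_v: [batch, emb_dim] embeddings of the two endpoints at query time
        return torch.sigmoid(self.scorer(torch.cat([z_u, z_v], dim=-1))).squeeze(-1)


# Example: probability that each of 4 candidate edges exists at query time
pred = LinkPredictor(emb_dim=32)
print(pred(torch.randn(4, 32), torch.randn(4, 32)).shape)  # torch.Size([4])
```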