Proceedings of the ACM Web Conference 2022
DOI: 10.1145/3485447.3511922
Time-aware Entity Alignment using Temporal Relational Attention

Cited by 18 publications (29 citation statements)
References 28 publications
“…As the formats of storing temporal information in TKGs are almost identical, the alignment information can be obtained easily, which naturally enhances EA performance. Nevertheless, recent studies [53,54] that apply TKGs to EA still suffer from the following two problems.…”
Section: G S G T (mentioning)
confidence: 99%
“…They [53,54] follow studies that apply KGs to EA, which cannot perform EA until the pre-aligned seeds are obtained. However, unlike in KGs, the temporal information in the relational triples of TKGs is naturally aligned, since it represents real-world time points or time periods [53,54]. This characteristic of TKGs offers the opportunity to develop time-aware EA methods in an unsupervised fashion by treating temporal information as seed alignment.…”
Section: G S G T (mentioning)
confidence: 99%
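The idea above, that shared timestamps across two TKGs can serve as free seed alignment, can be sketched as follows. This is a minimal illustrative sketch, not the cited papers' method: the function name, the quadruple format `(head, relation, tail, timestamp)`, and the one-entity-per-timestamp heuristic are all assumptions.

```python
# Hypothetical sketch: mining seed entity pairs from timestamps that two
# temporal knowledge graphs share, since timestamps denote real-world time
# points and are therefore "pre-aligned" across graphs.
from collections import defaultdict

def time_seed_alignment(triples_src, triples_tgt):
    """Pair entities across two TKGs that are the unique mentions of a
    shared timestamp. Triples are (head, rel, tail, timestamp)."""
    def index_by_time(triples):
        idx = defaultdict(set)
        for head, rel, tail, ts in triples:
            idx[ts].add(head)
            idx[ts].add(tail)
        return idx

    src_idx = index_by_time(triples_src)
    tgt_idx = index_by_time(triples_tgt)

    seeds = []
    for ts in sorted(src_idx.keys() & tgt_idx.keys()):
        # A timestamp attached to exactly one entity on each side is a
        # high-precision candidate seed pair; ambiguous ones are skipped.
        if len(src_idx[ts]) == 1 and len(tgt_idx[ts]) == 1:
            seeds.append((next(iter(src_idx[ts])), next(iter(tgt_idx[ts]))))
    return seeds
```

Real systems would relax the uniqueness filter and combine such seeds with structural signals, but the sketch shows why identical timestamp formats make unsupervised seeding possible at all.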
“…In particular, the time-aware entity alignment approach based on graph neural networks (TEA-GNN) (Xu et al, 2021) first designs a time-aware GNN to cope with TEA, which exploits a time-aware mechanism to introduce time information into the entity embeddings. The time-aware entity alignment using temporal relational attention (TREA) (Xu et al, 2022), on the other hand, incorporates temporal embeddings to enrich the entity embeddings and achieves state-of-the-art performance. Nevertheless, they cannot fully tackle the aforementioned challenges brought by TEA.…”
Section: Introduction (mentioning)
confidence: 99%
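The temporal relational attention described in this passage can be illustrated with a small sketch: neighbor messages are weighted by GAT-style attention scores computed from both relation and timestamp embeddings. The shapes, the concatenation-based scoring function, and all parameter names are illustrative assumptions, not TREA's or TEA-GNN's exact formulation.

```python
# Minimal sketch of one temporal relational attention step: attention over
# a node's neighbors depends on relation AND time embeddings, so edges with
# different timestamps contribute differently to the aggregated embedding.
import numpy as np

def temporal_rel_attention(h_center, neighbors, rel_emb, time_emb, a):
    """h_center: center-entity vector; neighbors: list of
    (entity_vec, rel_id, time_id); a: attention vector over the
    concatenated [center; neighbor; relation; time] features."""
    scores, msgs = [], []
    for h_n, r, t in neighbors:
        feat = np.concatenate([h_center, h_n, rel_emb[r], time_emb[t]])
        s = feat @ a
        scores.append(np.where(s > 0, s, 0.2 * s))  # LeakyReLU, GAT-style
        msgs.append(h_n)
    w = np.exp(np.array(scores) - np.max(scores))
    w = w / w.sum()  # softmax over the neighborhood
    # Aggregate neighbor vectors as an attention-weighted sum.
    return sum(wi * m for wi, m in zip(w, msgs))
```

Because the score sees `time_emb[t]`, two neighbors connected by the same relation at different times receive different weights, which is the property the time-aware mechanism adds over a plain relational GNN.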