Findings of the Association for Computational Linguistics: EMNLP 2023
DOI: 10.18653/v1/2023.findings-emnlp.418

Efficient Continue Training of Temporal Language Model with Structural Information

Zhaochen Su, Juntao Li, Zikang Zhang, et al.

Abstract: Current language models are mainly trained on snapshots of data gathered at a particular time, which decreases their capability to generalize over time and model language change. To model the time variable, existing works have explored temporal language models (e.g., TempoBERT) by directly incorporating the timestamp into the training process. While effective to some extent, these methods are limited by the superficial temporal information brought by timestamps, which fails to learn the inherent changes of l…
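The timestamp-conditioning idea the abstract refers to (as in TempoBERT) can be sketched as prepending a special time token to each training text before masked language modeling. The sketch below is illustrative only, not the paper's code; the token format, year range, and toy masking routine are assumptions.

```python
# Minimal sketch of timestamp-prepending for a temporal language model
# (TempoBERT-style). Token names and the masking routine are hypothetical.
import random

TIME_TOKENS = {year: f"<{year}>" for year in range(2015, 2023)}

def add_time_token(text: str, year: int) -> str:
    """Prepend the document's timestamp as a special token."""
    return f"{TIME_TOKENS[year]} {text}"

def mask_tokens(tokens: list[str], mask_prob: float = 0.15,
                mask_token: str = "[MASK]"):
    """Toy masked-LM corruption: randomly mask tokens, never the time token."""
    masked, labels = [], []
    for tok in tokens:
        if tok not in TIME_TOKENS.values() and random.random() < mask_prob:
            masked.append(mask_token)
            labels.append(tok)   # model must recover the original token
        else:
            masked.append(tok)
            labels.append(None)  # position not scored
    return masked, labels

if __name__ == "__main__":
    example = add_time_token("the prime minister announced new measures", 2019)
    print(mask_tokens(example.split()))
```

Conditioning on the time token lets the same masked word (e.g., "prime minister") resolve to period-appropriate completions, which is what timestamp-based temporal training exploits.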

Cited by 0 publications
References 37 publications