2019
DOI: 10.1609/aaai.v33i01.33016326

Training Temporal Word Embeddings with a Compass

Abstract: Temporal word embeddings have been proposed to support the analysis of word meaning shifts over time and to study the evolution of languages. Different approaches have been proposed to generate vector representations of words that embed their meaning during a specific time interval. However, the training process used in these approaches is complex, may be inefficient, or may require large text corpora. As a consequence, these approaches may be difficult to apply in resource-scarce domains or by scientists …

Cited by 37 publications (45 citation statements)
References 12 publications

“…Over the past few years, different methods have been proposed to align different temporal slices of the same corpus with the goal of quantifying lexical semantic shift in a data-driven and scalable way (Bianchi et al., 2020; Di Carlo et al., 2019; Dubossarsky et al., 2019; Hamilton et al., 2016b; Kim et al., 2014; Kulkarni et al., 2015; Rudolph & Blei, 2018; Tahmasebi et al., 2018; Yao et al., 2018). For example, Hamilton et al. (2016b) used a Procrustes transformation to align embeddings, while Yao et al. (2018) used a joint optimization procedure.…”
Section: Word Embeddings
Mentioning confidence: 99%
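To make the alignment step concrete, here is a minimal sketch of the orthogonal Procrustes alignment of the kind Hamilton et al. (2016b) describe. The function name, the toy data, and the preprocessing assumptions (mean-centered, row-normalized embeddings over a shared vocabulary) are illustrative, not taken from the cited paper.

```python
import numpy as np

def align_procrustes(base, other):
    """Rotate `other` into the space of `base` via orthogonal Procrustes.

    base, other: (vocab, dim) arrays over a shared vocabulary, assumed
    already mean-centered and row-normalized (hypothetical preprocessing).
    Returns `other` mapped by the orthogonal R minimizing ||other @ R - base||_F.
    """
    # SVD of the cross-covariance matrix yields the optimal rotation R = U V^T.
    u, _, vt = np.linalg.svd(other.T @ base)
    return other @ (u @ vt)

# Toy usage with random "embeddings" for 5 shared words in 3 dimensions.
rng = np.random.default_rng(0)
base = rng.normal(size=(5, 3))
other = rng.normal(size=(5, 3))
aligned = align_procrustes(base, other)
```

Because R is constrained to be orthogonal, the rotation aligns the two slices without distorting distances within each slice, which is why cosine similarities computed after alignment remain meaningful.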
“…For example, Hamilton et al. (2016b) used a Procrustes transformation to align embeddings, while Yao et al. (2018) used a joint optimization procedure. More recently, Di Carlo et al. (2019) proposed the temporally aligned word embeddings with a compass (TWEC) model, which extends the continuous bag of words (CBoW) architecture (Mikolov et al., 2013). The CBoW architecture is a neural network with one hidden layer and uses two matrices to learn lexical representations, a target matrix and a context matrix.…”
Section: Word Embeddings
Mentioning confidence: 99%
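As a rough illustration of the two-matrix CBoW structure named in the statement above, the sketch below shows how a TWEC-style compass could be wired in PyTorch: a shared context matrix trained once on the whole corpus, then frozen while per-slice target matrices are trained. The class, argument names, and the choice of which matrix is frozen reflect our reading of Di Carlo et al. (2019) and should be treated as assumptions; the training loop is only outlined in comments.

```python
import torch
import torch.nn as nn

class CBoW(nn.Module):
    """CBoW with the two matrices named in the citation statement:
    a context matrix (input embeddings) and a target matrix (output
    embeddings). All names here are illustrative, not from the paper."""

    def __init__(self, vocab_size, dim, compass=None):
        super().__init__()
        self.context = nn.Embedding(vocab_size, dim)          # context matrix
        self.target = nn.Linear(dim, vocab_size, bias=False)  # target matrix
        if compass is not None:
            # TWEC-style reuse (our reading of Di Carlo et al., 2019):
            # copy the atemporal context matrix and freeze it, so the
            # per-slice target matrices end up in one shared space.
            self.context.weight.data.copy_(compass)
            self.context.weight.requires_grad = False

    def forward(self, context_ids):
        # Average the context vectors of a window, then score all targets.
        hidden = self.context(context_ids).mean(dim=1)
        return self.target(hidden)  # logits over the vocabulary

# Two-stage outline (hypothetical):
# 1) train CBoW on the concatenated corpus; keep compass = model.context.weight
# 2) for each time slice, train CBoW(vocab_size, dim, compass=compass);
#    that slice's word vectors are the rows of its trainable target matrix.
```

Freezing the shared matrix is what makes the per-slice vectors implicitly aligned: every slice is trained against the same fixed coordinate system, so no post-hoc Procrustes step is needed.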