From static to dynamic word representations: a survey
2020 | DOI: 10.1007/s13042-020-01069-8

Cited by 61 publications (28 citation statements) | References 68 publications
“…Neural network language models introduced the idea of deep learning into language modeling by learning a distributed representation of words. These distributed word representations, trained on massive amounts of unannotated textual data, have been shown to provide good lower-dimension feature representations across a wide range of NLP tasks (Wang et al., 2020). The Continuous Bag-of-Words and Skip-gram models, proposed to reduce computational complexity, are considered a milestone in the development of so-called word embeddings (Mikolov et al., 2013), followed by the Global Vectors (GloVe) (Pennington et al., 2014) and fastText (Bojanowski et al., 2016) models.…”
Section: Introduction
Mentioning confidence: 99%
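To make the excerpt's milestone concrete, here is a minimal, hypothetical sketch of training Skip-gram embeddings with the gensim library on a toy corpus; the corpus, hyperparameters, and probe word are illustrative assumptions, not anything taken from the surveyed papers.

```python
from gensim.models import Word2Vec

# Toy tokenized corpus; real embeddings are trained on massive
# unannotated text, as the excerpt notes.
corpus = [
    ["neural", "language", "models", "learn", "distributed", "representations"],
    ["word", "embeddings", "give", "low", "dimensional", "features"],
    ["skip", "gram", "predicts", "context", "words", "from", "a", "target", "word"],
]

# sg=1 selects Skip-gram; sg=0 would select Continuous Bag-of-Words.
model = Word2Vec(corpus, vector_size=50, window=3, min_count=1, sg=1, epochs=50)

vec = model.wv["word"]                        # 50-dimensional static vector
print(model.wv.most_similar("word", topn=3))  # nearest neighbours in the toy space
```

The key property, relevant to the last excerpt below, is that each word gets exactly one vector regardless of context.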
“…Our proposed research methods, based on textual alignment and word embeddings, can naturally be extended to other languages, since alignment methods have a strong foundation in multilingual contexts (Tiedemann, 2011). Furthermore, word embeddings for other languages (Wang et al., 2020) can be utilized to detect paraphrase types in multiple languages.…”
Section: Discussion
Mentioning confidence: 99%
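As a hedged illustration of reusing pretrained embeddings for another language, the sketch below loads fastText vectors (the file name follows the public fasttext.cc naming, but the local path is an assumption) and scores two German sentences by the cosine similarity of their averaged word vectors. This is only a crude stand-in for the paraphrase-type detection methods the excerpt refers to, not the authors' approach.

```python
import numpy as np
from gensim.models import KeyedVectors

# Assumed local copy of pretrained German fastText vectors
# (cc.de.300.vec as distributed at fasttext.cc); path is illustrative.
wv = KeyedVectors.load_word2vec_format("cc.de.300.vec", limit=200_000)

def sentence_vector(tokens):
    """Average the vectors of the in-vocabulary tokens."""
    vecs = [wv[t] for t in tokens if t in wv]
    return np.mean(vecs, axis=0)

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# A high score is a weak paraphrase signal between the two sentences.
s1 = sentence_vector("das ist ein einfaches Beispiel".split())
s2 = sentence_vector("dies ist ein simples Exempel".split())
print(cosine(s1, s2))
```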
“…However, for complex tasks that require finer discrimination between classes, their main drawback is that they do not capture information about the contexts of the words. To address this limitation, several types of contextualized word embeddings (e.g., ELMo, Flair, and BERT) have recently been proposed [45,46]. Their idea is to add syntactic and semantic information to the words' representations, aiming to dynamically capture their meaning.…”
[Figure 2: The bag-of-polarities representation: each document in the collection is represented by a dual vector that captures the occurrence of words in both positive and negative contexts.]
Section: Bag of Polarities: A Dual Word-Sentiment Representation for Depression Detection
Mentioning confidence: 99%
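To show what "dynamically capture their meaning" looks like in practice, here is a minimal sketch using the Hugging Face transformers library to extract BERT vectors for the same surface word in two different contexts; the model name and example sentences are illustrative assumptions, not the setup of the cited work.

```python
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def word_vector(sentence: str, word: str) -> torch.Tensor:
    """Return the last-layer hidden state of `word` (assumed to be a
    single WordPiece token) inside `sentence`."""
    enc = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**enc).last_hidden_state[0]  # (seq_len, 768)
    tokens = tokenizer.convert_ids_to_tokens(enc["input_ids"][0].tolist())
    return hidden[tokens.index(word)]

# The same word, two contexts: a static embedding would give one vector,
# while a contextualized model gives two different ones.
a = word_vector("she deposited cash at the bank", "bank")
b = word_vector("they fished along the river bank", "bank")
print(torch.nn.functional.cosine_similarity(a, b, dim=0).item())
```

The printed similarity is well below 1.0, which is exactly the context sensitivity that the static representations in the earlier sketch cannot provide.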