2020
DOI: 10.1016/j.neucom.2020.03.094
Sentiment aware word embeddings using refinement and senti-contextualized learning approach

Cited by 29 publications (18 citation statements)
References 19 publications
“…The above two methods make Word2Vec suitable for large-scale corpus training. Naderalvojoud et al. proposed an improved word embedding method that adds sentiment information to word vectors pre-trained by Word2Vec, making the vectors suitable for sentiment-oriented text analysis [19]. Onan combined word embedding methods with cluster analysis and showed that this combination improves text analysis accuracy [20].…”
Section: Word Vector Generation
confidence: 99%
“…Word embedding is a technique for analyzing the context of words in a sentence and converting them into vector values. Word embedding methods include GloVe [16], FastText [17], and Word2vec [18]. Word2vec fails to capture co-occurrence statistics beyond the local context because learning is performed only within a user-specified window.…”
Section: Word2vec-based Word Embedding
confidence: 99%
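The window limitation quoted above can be illustrated with a minimal sketch of how skip-gram-style training pairs are extracted; `window_pairs` is a hypothetical helper, not part of any cited implementation:

```python
def window_pairs(tokens, window=2):
    """Generate (center, context) training pairs as in Word2vec's
    skip-gram: only words inside the window around each center word
    are observed, so co-occurrences farther apart never contribute."""
    pairs = []
    for i, center in enumerate(tokens):
        lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                pairs.append((center, tokens[j]))
    return pairs

# With window=1, 'a' and 'c' never co-occur in any pair:
print(window_pairs(["a", "b", "c"], window=1))
# [('a', 'b'), ('b', 'a'), ('b', 'c'), ('c', 'b')]
```

This contrasts with GloVe, which factorizes global co-occurrence counts accumulated over the whole corpus rather than learning from local windows alone.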
“…For example, 'kill' → (ki, il, ll). This generates vectors for words that are not found in the dictionary; moreover, its training process is fast [17]. Word2vec was proposed by Google as an improvement on the neural network language model [18].…”
Section: Word2vec-based Word Embedding
confidence: 99%
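The subword splitting in the quoted example can be sketched as plain character n-gram extraction. Note that full FastText additionally adds word-boundary markers (`<`, `>`) and uses a range of n-gram lengths; this minimal sketch only reproduces the quoted bigram example:

```python
def char_ngrams(word, n=2):
    """Split a word into overlapping character n-grams, matching the
    quoted example 'kill' -> (ki, il, ll). Full FastText also adds
    boundary markers and sums n-gram vectors to build word vectors."""
    return [word[i:i + n] for i in range(len(word) - n + 1)]

print(char_ngrams("kill"))  # ['ki', 'il', 'll']
```

Because out-of-vocabulary words still share n-grams with known words, their vectors can be composed from the n-gram vectors, which is why the quoted statement notes that FastText handles words not found in the dictionary.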
“…In some NLP tasks, such as information retrieval [21,22], this may not be a problem. However, it does affect the performance of tasks such as sentiment analysis, because sentiment analysis cannot rely on contextual information alone to determine sentence sentiment [23,24]. Since word embedding models ignore sentiment information, the closest words in the vector space may have opposite sentiment polarities.…”
Section: Introduction
confidence: 99%
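The polarity problem quoted above can be demonstrated with cosine similarity on toy vectors. The vectors below are hypothetical, chosen only to mimic how antonyms such as "good" and "bad" share contexts ("the movie was ___") and therefore end up close in a purely contextual embedding space:

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Hypothetical toy embeddings: 'good' and 'bad' are nearly parallel
# despite opposite sentiment, while an unrelated word is not.
vectors = {
    "good":  [0.9, 0.8, 0.1],
    "bad":   [0.8, 0.9, 0.1],
    "table": [0.1, 0.0, 0.9],
}
print(cosine(vectors["good"], vectors["bad"]))    # high similarity
print(cosine(vectors["good"], vectors["table"]))  # low similarity
```

Sentiment-aware refinement methods, such as the one in the reported paper, adjust such vectors so that nearest neighbors also agree in polarity.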