2020 International Conference in Mathematics, Computer Engineering and Computer Science (ICMCECS) 2020
DOI: 10.1109/icmcecs47690.2020.246997
The Significance of Global Vectors Representation in Sarcasm Analysis

Cited by 12 publications (6 citation statements)
References 25 publications
“…GloVe is similar to latent semantic analysis but uses co-occurrences in a word matrix and compares texts against an aggregated corpus. GloVe is vital in assessing complex language skills (e.g., sarcasm; Eke et al., 2020) and approximating highly reliable human-rated scores of fluency, elaboration, openness, intellect, and self-reported creative activities (Dumas et al., 2021).…”
Section: Semantic Distance
confidence: 99%
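The co-occurrence idea the statement refers to can be illustrated with a minimal sketch. The toy corpus, window choice, and helper name below are illustrative assumptions, not the actual GloVe training pipeline, which learns vectors by weighted factorization of log co-occurrence counts over very large corpora.

```python
from collections import Counter
from itertools import combinations
import math

# Toy corpus (assumption: whitespace tokenization, whole sentence as the window);
# real GloVe training uses a sliding window over billions of tokens.
corpus = [
    "the cat sat on the mat",
    "the dog sat on the rug",
    "sarcasm is hard to detect",
]

# Build symmetric word-word co-occurrence counts.
cooc = Counter()
for sentence in corpus:
    tokens = sentence.split()
    for w1, w2 in combinations(tokens, 2):
        cooc[frozenset((w1, w2))] += 1

def log_cooccurrence(w1, w2):
    """GloVe's objective fits dot products to log co-occurrence counts;
    this helper (hypothetical name) just exposes that quantity."""
    count = cooc.get(frozenset((w1, w2)), 0)
    return math.log(count) if count else float("-inf")
```

Words that appear together more often ("sat" and "on" in two sentences) get a larger log count than words that co-occur once, which is the signal GloVe's factorization preserves.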
“…Word embedding is a method for converting the words in a text into numerical vectors, and it has a wide range of applications in NLP [14]. It projects the high-dimensional space whose dimension equals the vocabulary size into a continuous vector space of much lower dimension, mapping each word or phrase to a real-valued vector.…”
Section: A Word Embedding
confidence: 99%
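The mapping described above can be sketched as a simple lookup table from words to dense vectors. The vocabulary, 4-dimensional size, and `embed` helper are illustrative assumptions; trained embeddings like GloVe typically use 100-300 dimensions learned from data rather than random values.

```python
import random

# Toy embedding table: each vocabulary word maps to a low-dimensional
# real-valued vector (assumption: 4 dimensions, random initialization).
random.seed(0)
vocab = ["word", "embedding", "vector", "sarcasm"]
dim = 4
embeddings = {w: [random.uniform(-1, 1) for _ in range(dim)] for w in vocab}

def embed(tokens):
    """Map a token sequence to its dense vectors, skipping out-of-vocabulary words."""
    return [embeddings[t] for t in tokens if t in embeddings]

sentence_vectors = embed("word embedding".split())
```

This is the sense in which the vocabulary-sized one-hot space is "embedded" into a much lower-dimensional continuous space: each word becomes a short list of real numbers instead of a sparse indicator.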