2015
DOI: 10.13064/ksss.2015.7.4.041
A Study on Word Vector Models for Representing Korean Semantic Information

Abstract: This paper examines whether the Global Vector model is applicable to Korean data as a universal learning algorithm. The main purpose of this study is to compare the global vector model (GloVe) with the word2vec models such as a continuous bag-of-words (CBOW) model and a skip-gram (SG) model. For this purpose, we conducted an experiment by employing an evaluation corpus consisting of 70 target words and 819 pairs of Korean words for word similarities and analogies, respectively. Results of the word similarity t…
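The similarity and analogy evaluations described in the abstract are typically computed with cosine similarity over the learned word vectors, and analogies are solved with the vector-offset method (b − a + c). A minimal sketch, using hypothetical toy 2-D vectors purely for illustration (the paper's actual Korean evaluation data are not reproduced here):

```python
import numpy as np

def cosine(u, v):
    # cosine similarity between two word vectors
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def analogy(vectors, a, b, c):
    # solve a : b :: c : ? via the vector offset b - a + c,
    # returning the nearest word while excluding the three inputs
    target = vectors[b] - vectors[a] + vectors[c]
    best, best_sim = None, -1.0
    for word, vec in vectors.items():
        if word in (a, b, c):
            continue
        sim = cosine(target, vec)
        if sim > best_sim:
            best, best_sim = word, sim
    return best

# toy vectors (hypothetical values, for illustration only)
vecs = {
    "king":  np.array([0.9, 0.1]),
    "queen": np.array([0.9, 0.9]),
    "man":   np.array([0.5, 0.1]),
    "woman": np.array([0.5, 0.9]),
}
print(analogy(vecs, "man", "king", "woman"))  # prints "queen"
```

The same offset-and-nearest-neighbour procedure applies unchanged to Korean word pairs; only the vocabulary and the trained vectors differ.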

Cited by 5 publications (1 citation statement)
References 5 publications (3 reference statements)
“…GloVe is an unsupervised learning algorithm for obtaining word embedding or vector representation of words [81]. The idea is based on obtaining the statistics of word-word co-occurrence from a corpus, which results in representations that showcase interesting linear structures of the word vector space [82].…”
Section: Multilayer Perceptron
Confidence: 99%
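The citation statement above notes that GloVe starts from word-word co-occurrence statistics gathered over a corpus. A minimal sketch of that counting step, using a symmetric sliding window (GloVe additionally weights each pair by the inverse of the token distance, which this simplified version omits):

```python
from collections import defaultdict

def cooccurrence(tokens, window=2):
    # count symmetric word-word co-occurrences within a fixed context window
    counts = defaultdict(float)
    for i, w in enumerate(tokens):
        lo = max(0, i - window)
        hi = min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if i != j:
                counts[(w, tokens[j])] += 1.0
    return counts

corpus = "the cat sat on the mat".split()
X = cooccurrence(corpus, window=1)
print(X[("the", "cat")])  # prints 1.0
```

The resulting counts form the sparse matrix whose log-ratios GloVe fits with a weighted least-squares objective; the same counting applies to a tokenized Korean corpus.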