2018
DOI: 10.48550/arxiv.1808.07337
Preprint
Expansional Retrofitting for Word Vector Enrichment

Hwiyeol Jo

Abstract: Retrofitting techniques, which inject external resources into word representations, have compensated for the weakness of distributed representations in capturing semantic and relational knowledge between words. Implicitly retrofitting word vectors with an expansional technique outperforms retrofitting on word similarity tasks while also generalizing the word vectors. In this paper, we propose unsupervised extrofitting: expansional retrofitting (extrofitting) without external semantic lexicons. We also propose deep extrofitting: in-depth …
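To make the baseline concrete: classic retrofitting (Faruqui et al., 2015), which extrofitting builds on, iteratively pulls each word vector toward the vectors of its lexicon neighbours while keeping it close to its original distributional vector. A minimal sketch, with toy vectors and a toy lexicon as illustrative assumptions:

```python
import numpy as np

def retrofit(embeddings, lexicon, iterations=10, alpha=1.0):
    """Retrofit word vectors to a semantic lexicon.

    embeddings: dict mapping word -> np.ndarray (original vectors, kept fixed)
    lexicon:    dict mapping word -> list of semantically related words
    alpha:      weight tying each vector to its original embedding
    """
    new_vecs = {w: v.copy() for w, v in embeddings.items()}
    for _ in range(iterations):
        for word, neighbours in lexicon.items():
            nbrs = [n for n in neighbours if n in new_vecs]
            if word not in new_vecs or not nbrs:
                continue
            # Closed-form update: weighted average of the original vector
            # and the current vectors of the word's lexicon neighbours.
            total = alpha * embeddings[word] + sum(new_vecs[n] for n in nbrs)
            new_vecs[word] = total / (alpha + len(nbrs))
    return new_vecs

# Toy example: "good" and "great" are linked in the lexicon, "bad" is not.
embeddings = {
    "good":  np.array([1.0, 0.0]),
    "great": np.array([0.0, 1.0]),
    "bad":   np.array([1.0, 1.0]),
}
lexicon = {"good": ["great"], "great": ["good"]}
out = retrofit(embeddings, lexicon)
```

After retrofitting, the linked pair ("good", "great") moves closer together, while words absent from the lexicon ("bad") are left untouched. Extrofitting replaces the external lexicon with an expansional transformation of the vectors themselves, which is what lets the unsupervised variant drop the lexicon entirely.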

Cited by 1 publication (1 citation statement)
References 16 publications
“…We use GloVe (Pennington et al, 2014) glove.42B.300d as word embedding for M emb . We also perform word vector post-processing method, extrofitting (Jo, 2018), to improve the effect of initialization with pretrained word embeddings on text classification, as described in their paper. We use 4 text classification datasets; IMDB review (Maas et al, 2011), AGNews, Yelp review (Zhang et al, 2015), and Yahoo!Answers (Chang et al, 2008).…”
Section: Experiments Data
confidence: 99%