Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics 2019
DOI: 10.18653/v1/p19-1476

Generalized Tuning of Distributional Word Vectors for Monolingual and Cross-Lingual Lexical Entailment

Abstract: Lexical entailment (LE; also known as hyponymy-hypernymy or is-a relation) is a core asymmetric lexical relation that supports tasks like taxonomy induction and text generation. In this work, we propose a simple and effective method for fine-tuning distributional word vectors for LE. Our Generalized Lexical ENtailment model (GLEN) is decoupled from the word embedding model and applicable to any distributional vector space. Yet, unlike existing retrofitting models, it captures a general specialization function al…
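To make the idea of a generalized specialization function concrete, here is a minimal sketch, assuming PyTorch, 300-dimensional input vectors, and a margin-based ranking loss over hyponym-hypernym pairs with random negatives; these are illustrative assumptions, not confirmed details of GLEN. The sketch trains a small feed-forward mapping that, once learned, can be applied to any vector from the original distributional space, which is the sense in which such a model is decoupled from the word embedding model.

```python
# Illustrative sketch of an LE specialization function (NOT the authors' exact
# GLEN objective): a feed-forward map f applied to generic word vectors,
# trained so that hyponym-hypernym pairs score higher than negative pairs.
import torch
import torch.nn as nn


class SpecializationFunction(nn.Module):
    """Maps distributional vectors into an LE-specialized space."""

    def __init__(self, dim: int = 300, hidden: int = 512):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)


def asymmetric_le_score(u: torch.Tensor, v: torch.Tensor) -> torch.Tensor:
    """Toy asymmetric score: cosine similarity plus a norm-difference term,
    so hyponyms (u) are pushed toward smaller norms than hypernyms (v)."""
    cos = nn.functional.cosine_similarity(u, v, dim=-1)
    return cos + (v.norm(dim=-1) - u.norm(dim=-1))


def train_step(f, opt, hypo, hyper, neg, margin=1.0):
    """One update with a margin-based ranking loss over positive vs. negative pairs."""
    opt.zero_grad()
    pos = asymmetric_le_score(f(hypo), f(hyper))
    neg_s = asymmetric_le_score(f(hypo), f(neg))
    loss = torch.relu(margin - pos + neg_s).mean()
    loss.backward()
    opt.step()
    return loss.item()


f = SpecializationFunction()
opt = torch.optim.Adam(f.parameters(), lr=1e-3)
# Random stand-ins for batches of (hyponym, hypernym, negative) word vectors.
hypo, hyper, neg = torch.randn(32, 300), torch.randn(32, 300), torch.randn(32, 300)
print(train_step(f, opt, hypo, hyper, neg))
```

Because the learned mapping is separate from the embedding model, it can also be applied to vectors of words unseen during training, and, plausibly, to vectors from another language once the monolingual spaces are aligned, which is the cross-lingual setting the title refers to.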

Cited by 7 publications (4 citation statements). References 45 publications.
“…Depending on the properties of the annotation, the task is either single-label or multi-label text classification. In single-label text classification, each text is assigned exactly one label, which is used in NLP applications where the labels are mutually exclusive, such as in entailment or stance detection (Kim, 2014; Glavaš and Vulić, 2019; Kennedy et al., 2019; Li and Caragea, 2019). In contrast, multi-label text classification assigns any number of categories to a text, which is better suited for tasks where the categories are overlapping or describe complementary aspects, e.g.…”
Section: Background and Related Work
confidence: 99%
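As an aside on the single-label versus multi-label distinction drawn in this citation context, the short sketch below uses hypothetical texts, labels, and NumPy arrays (not taken from any of the cited papers) to show how the two settings differ in their target representation.

```python
# Illustrative contrast between single-label and multi-label targets
# (hypothetical texts and labels, chosen only for this example).
import numpy as np

texts = ["the cat is a kind of animal", "stocks fell while bonds rose"]
label_set = ["entailment", "finance", "animals"]

# Single-label: each text gets exactly one class index (mutually exclusive labels).
single_label_targets = np.array([0, 1])  # "entailment", "finance"

# Multi-label: each text gets a binary indicator vector (overlapping categories).
multi_label_targets = np.array([
    [1, 0, 1],  # "entailment" and "animals"
    [0, 1, 0],  # "finance" only
])

print(single_label_targets.shape)  # (2,)
print(multi_label_targets.shape)   # (2, 3)
```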
“…First, specialized similarity measures can be defined to distinguish different relations [29]. Another solution is to specialize word embeddings for particular relations using external knowledge [9,36]. However, these methods are one-relation specific and cannot differentiate between multiple semantic relations at a time.…”
Section: Related Work
confidence: 99%
“…The most studied lexical semantic relations are synonymy, co-hyponymy, hypernymy, or meronymy, but more exist [37]. Numerous approaches have been proposed to identify one particular semantic relation of interest following either the paradigmatic approach [28,29,33,39], the distributional model [9,31,36,37], or their combination [25,32].…”
Section: Introduction
confidence: 99%