Findings of the Association for Computational Linguistics: ACL 2022
DOI: 10.18653/v1/2022.findings-acl.208
Lacking the Embedding of a Word? Look it up into a Traditional Dictionary

Abstract: Word embeddings are powerful dictionaries, which may easily capture language variations. However, these dictionaries fail to give sense to rare words, which are surprisingly often covered by traditional dictionaries. In this paper, we propose to use definitions retrieved in traditional dictionaries to produce word embeddings for rare words. For this purpose, we introduce two methods: Definition Neural Network (DefiNNet) and Define BERT (DefBERT). In our experiments, DefiNNet and DefBERT significantly outperfor…
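The core idea in the abstract can be illustrated with a minimal sketch: approximate the embedding of an out-of-vocabulary word by combining the embeddings of the words in its dictionary definition. This is only a toy baseline in the spirit of DefiNNet/DefBERT, not the authors' actual models; the embedding table, its values, and the word "wombat" are hypothetical.

```python
import numpy as np

# Toy pre-trained embedding table (hypothetical 3-d vectors for illustration).
embeddings = {
    "small": np.array([0.9, 0.1, 0.0]),
    "furry": np.array([0.2, 0.8, 0.1]),
    "animal": np.array([0.1, 0.2, 0.9]),
}

def embed_from_definition(definition, table):
    """Build a vector for a word missing from `table` by averaging the
    vectors of the known words in its dictionary definition."""
    vecs = [table[w] for w in definition.lower().split() if w in table]
    if not vecs:
        raise KeyError("no definition word found in the embedding table")
    return np.mean(vecs, axis=0)

# "wombat" has no pre-trained embedding; derive one from its definition.
vec = embed_from_definition("small furry animal", embeddings)
```

The paper's methods replace this naive averaging with a trained neural network over the definition (DefiNNet) or a BERT encoding of it (DefBERT), but the input/output contract is the same: definition text in, word vector out.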


Cited by 6 publications (2 citation statements) · References 46 publications
“…[11,100]) and that, following an appropriate methodology, they can be combined to create vectors for entities that do not appear in their pre-training data [27,30]. The questionnaire was composed of nine questions only, to which participants had to answer to with open-ended natural sentences -which by Natural Language Processing (NLP) standards is extremely small [24].…”
Section: Discussion · Citation type: mentioning · Confidence: 99%
“…However, when it comes to personally familiar people or places, two main challenges arise. First, the extraction of semantic representations for concepts based on to robustly capture their meaning [21][22][23][24][25]. Despite their fundamental importance in our lives, personally familiar people and places (a close friend, one's favourite neighbourhood) never -or extremely rarely -get mentioned in large-scale corpora such as Wikipedia.…”
Section: Introduction · Citation type: mentioning · Confidence: 99%