2020
DOI: 10.3390/info11050268
Fully-Unsupervised Embeddings-Based Hypernym Discovery

Abstract: The hypernymy relation is the one occurring between an instance term and its more general term (e.g., “lion” and “animal”, “Italy” and “country”). This paper addresses Hypernym Discovery, the NLP task that aims at finding valid hypernyms for words in a given text, and proposes HyperRank, an unsupervised approach that therefore does not require manually-labeled training sets, unlike most approaches in the literature. The proposed algorithm exploits the cosine distance of points in the vector space of word embeddings, as…
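The abstract's core idea — ranking candidate hypernyms by cosine distance in a word-embedding space — can be sketched as follows. This is a minimal illustration, not the paper's actual implementation: the toy vectors, the `rank_hypernym_candidates` helper, and the candidate list are all hypothetical; in practice the embeddings would come from a pretrained model.

```python
import numpy as np

# Toy 3-dimensional embeddings; a real system would load vectors
# from a pretrained model (e.g., word2vec or GloVe).
embeddings = {
    "lion":    np.array([0.9, 0.1, 0.2]),
    "animal":  np.array([0.8, 0.2, 0.3]),
    "country": np.array([0.1, 0.9, 0.1]),
    "italy":   np.array([0.2, 0.8, 0.2]),
}

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors (1 = same direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def rank_hypernym_candidates(term, candidates):
    """Rank candidate hypernyms of `term` by cosine similarity, best first."""
    v = embeddings[term]
    scored = [(c, cosine_similarity(v, embeddings[c])) for c in candidates]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

print(rank_hypernym_candidates("lion", ["animal", "country"]))
```

With these toy vectors, “animal” ranks above “country” as a hypernym of “lion”, because their vectors point in similar directions.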

Cited by 5 publications (4 citation statements)
References 29 publications
“…Zhang et al [21] proposed a method called TaxoGen to detect fine-grained topics using the combination of local-corpus embedding and spherical clustering. Atzori et al [22] proposed an unsupervised NLP-based method called HyperRank for detecting hypernyms from a corpus. This method employs the cosine distance between words in the vector space and does not require manually-labeled training sets.…”
Section: Related Work
confidence: 99%
“…The task of detecting taxonomic word relations has been tried via various approaches. From purely rule-based (Hearst, 1992), to using semantic treelike resources (Navigli et al, 2011), to adapting pre-trained language models (Atzori and Balloccu, 2020;Chen et al, 2021) or creating hybrid systems (Shwartz et al, 2016;Ravichander et al, 2020). Since this SemEval task is focused on neural (language) models, we opt to use BERT and focus mainly on different data generation techniques.…”
Section: Task Description
confidence: 99%
“…Since knowledge representation in a KG generates a structured summary while highlighting the proximity of related concepts in a collection of text documents [2], KG can be used as an efficient application in representing knowledge in a set of e-news articles based around a particular news event. One of the most important linguistic relationships that can be found within a KG is the hierarchical relationship "hypernyms" [15,16]. For example, the "is-a" relation of conceptual ontologies relates subordinate words (e.g., "Sri Lanka") with its superordinate (e.g., "Country").…”
Section: Knowledge Graph
confidence: 99%
“…For example, the "is-a" relation of conceptual ontologies relates subordinate words (e.g., "Sri Lanka") with its superordinate (e.g., "Country"). Hierarchical relation lies at the origin of human reasoning and permits relating words representing specific instances to generic ones; moreover, it is one of extreme importance in any taxonomy [16].…”
Section: Knowledge Graph
confidence: 99%
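The “is-a” relation described in the citation statements above (e.g., “Sri Lanka” is-a “Country”) can be illustrated with a minimal sketch of hypernym edges in a toy knowledge graph. The `ToyKnowledgeGraph` class and its methods are hypothetical, written only to show how hierarchical relations connect specific instances to generic concepts; real knowledge graphs use far richer representations.

```python
from collections import defaultdict

class ToyKnowledgeGraph:
    """A minimal graph storing only 'is-a' (hypernym) edges."""

    def __init__(self):
        self.is_a = defaultdict(set)  # term -> set of direct hypernyms

    def add_hypernym(self, term, hypernym):
        self.is_a[term].add(hypernym)

    def hypernyms(self, term):
        """All hypernyms reachable via 'is-a' edges (transitive closure)."""
        seen, stack = set(), [term]
        while stack:
            for h in self.is_a[stack.pop()]:
                if h not in seen:
                    seen.add(h)
                    stack.append(h)
        return seen

kg = ToyKnowledgeGraph()
kg.add_hypernym("Sri Lanka", "Country")
kg.add_hypernym("Country", "Geopolitical Entity")
print(kg.hypernyms("Sri Lanka"))
```

Following the edges transitively, “Sri Lanka” is related both to its direct superordinate “Country” and, through it, to the more generic “Geopolitical Entity”.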