2022
DOI: 10.3233/sw-212955
Taxonomy enrichment with text and graph vector representations

Abstract: Knowledge graphs such as DBpedia, Freebase or Wikidata always contain a taxonomic backbone that allows the arrangement and structuring of various concepts in accordance with the hypo-hypernym (“class-subclass”) relationship. With the rapid growth of lexical resources for specific domains, the problem of automatic extension of the existing knowledge bases with new words is becoming more and more widespread. In this paper, we address the problem of taxonomy enrichment which aims at adding new words to the existing t…

Cited by 5 publications (2 citation statements); references 59 publications.
“…There exist several recent papers on Taxonomy Enrichment that make use of word vector representations and/or large pre-trained language models. For instance, Nikishina et al. (2022a) present an approach applying numerous text and graph embeddings as well as their combinations; Takeoka et al. (2021) solve the same problem for the low-resource scenario using a BERT-based classifier, while Roller et al. (2018) revise Hearst Patterns for the task. Cho et al. (2020) view taxonomy enrichment as a sequence-to-sequence problem and train an LSTM model on the WordNet data.…”
Section: Related Work
confidence: 99%
“…The sixth paper, “Taxonomy Enrichment with Text and Graph Vector Representations” [16], by Irina Nikishina, Mikhail Tikhomirov, Varvara Logacheva, Yuriy Nazarov, Alexander Panchenko, and Natalia Loukachevitch, targets the problem of taxonomy enrichment, which aims at adding new words to an existing taxonomy. This paper provides a comprehensive study of the existing approaches to taxonomy enrichment based on word and graph vector representations.…”
Section: Content
confidence: 99%