Predicting Terms in IS-A Relations with Pre-trained Transformers
Irina Nikishina, Polina Chernomorchenko, Anastasiia Demidova, et al.
Abstract: In this paper, we explore the ability of generative transformers to predict objects in IS-A (hypo-hypernym) relations. We solve the task in both directions of the relation: we learn to predict hypernyms given an input word, and hyponyms given an input concept and its neighbourhood in the taxonomy. To the best of our knowledge, this is the first paper that provides a comprehensive analysis of transformer-based models for the task of hypernymy extraction. Apart from the standard fine-tuning of various g…
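As an illustration only, not the authors' method, the hypernym direction of the task can be framed as text completion with a pre-trained generative transformer. The sketch below assumes the Hugging Face transformers library and the public gpt2 checkpoint; the prompt wording and the input word "apple" are hypothetical choices, not taken from the paper.

# Minimal sketch: prompt a pre-trained generative transformer to continue
# an IS-A prompt with a candidate hypernym (illustrative setup only).
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

# Frame hypernym prediction as completion of an "X is a kind of ..." prompt.
prompt = "apple is a kind of"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=3,                      # hypernyms are typically short noun phrases
    do_sample=False,                       # greedy decoding keeps the sketch deterministic
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))

The same framing applies to the hyponym direction by reversing the prompt (e.g. conditioning on a concept and generating its subordinate terms), optionally after fine-tuning on taxonomy pairs as the abstract describes.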