Findings of the Association for Computational Linguistics: EMNLP 2020
DOI: 10.18653/v1/2020.findings-emnlp.42

A Fully Hyperbolic Neural Model for Hierarchical Multi-Class Classification

Abstract: Label inventories for fine-grained entity typing have grown in size and complexity. Nonetheless, they exhibit a hierarchical structure. Hyperbolic spaces offer a mathematically appealing approach for learning hierarchical representations of symbolic data. However, it is not clear how to integrate hyperbolic components into downstream tasks. This is the first work that proposes a fully hyperbolic model for multi-class multi-label classification, which performs all operations in hyperbolic space. We evaluate the …
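The abstract's appeal to hyperbolic space for hierarchical data rests on the geometry of the Poincaré ball, where distances grow exponentially toward the boundary, matching the exponential growth of tree levels. A minimal sketch of the standard Poincaré distance (this is the general formula from the hyperbolic-embedding literature, not code from the paper itself):

```python
import numpy as np

def poincare_distance(u, v, eps=1e-9):
    """Geodesic distance between two points inside the unit Poincare ball:
    d(u, v) = arcosh(1 + 2*||u - v||^2 / ((1 - ||u||^2) * (1 - ||v||^2)))."""
    sq_diff = np.sum((u - v) ** 2)
    denom = (1.0 - np.sum(u ** 2)) * (1.0 - np.sum(v ** 2))
    return float(np.arccosh(1.0 + 2.0 * sq_diff / max(denom, eps)))

# Near the origin the metric is almost Euclidean; near the boundary,
# points are exponentially far apart, which suits tree-like label sets.
origin = np.array([0.0, 0.0])
near_boundary = np.array([0.9, 0.0])
print(poincare_distance(origin, near_boundary))
```

Coarse types can sit near the origin and fine-grained types near the boundary, so parent-child pairs stay close while unrelated fine types are far apart.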


Cited by 23 publications (28 citation statements)
References 27 publications
“…We also list the numbers from prior work in Table 2. HY XLarge (López and Strube, 2020), a hyperbolic model designed to learn hierarchical structure in entity types, exceeds the performance of models of similar size, such as Choi et al. (2018) and Xiong et al. (2019), especially in macro-recall. In the ultra-fine class, both our box-based model and HY XLarge achieve higher macro-F1 than their vector-based counterparts.…”
Section: Entity Typing
Confidence: 99%
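Macro-recall and macro-F1, the metrics this quote compares on, are typically computed in entity typing by averaging per-example scores, so examples with rare fine-grained types count as much as frequent ones. A simplified sketch of that per-example ("macro") variant; exact definitions vary across papers, and the handling of empty prediction sets here is an assumption:

```python
def macro_prf1(gold_sets, pred_sets):
    """Example-averaged precision, recall and F1 over multi-label type sets.
    Per example: precision = |gold & pred| / |pred|, recall = |gold & pred| / |gold|.
    Averages are taken over examples, then combined into a harmonic-mean F1."""
    precisions, recalls = [], []
    for gold, pred in zip(gold_sets, pred_sets):
        inter = len(gold & pred)
        precisions.append(inter / len(pred) if pred else 0.0)
        recalls.append(inter / len(gold) if gold else 0.0)
    p = sum(precisions) / len(precisions)
    r = sum(recalls) / len(recalls)
    f1 = 2 * p * r / (p + r) if p + r else 0.0
    return p, r, f1
```

Because recall rewards recovering every gold type, models that represent the type hierarchy well (and so predict fine-grained types more readily) tend to gain most on macro-recall, as the quote observes.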
“…However, OE can only handle binary entailment decisions, and POE cannot model negative correlations between types, a critical limitation in its use as a probabilistic model; these shortcomings directly led to the development of box embeddings. Hyperbolic embeddings (Nickel and Kiela, 2017; López and Strube, 2020) can also model hierarchical relationships, as can hyperbolic entailment cones (Ganea et al., 2018); however, these approaches lack a probabilistic interpretation.…”
Section: Related Work
Confidence: 99%
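The probabilistic interpretation this quote credits to box embeddings comes from representing each type as an axis-aligned box and reading conditional probabilities off volume ratios. A hedged sketch of the hard (non-smoothed) version of that idea, with illustrative names; actual box-embedding models use smoothed volumes for trainability:

```python
def box_volume(lower, upper):
    """Volume of an axis-aligned box; zero if it is degenerate in any dimension."""
    vol = 1.0
    for lo, hi in zip(lower, upper):
        vol *= max(hi - lo, 0.0)
    return vol

def conditional_prob(box_a, box_b):
    """P(a | b) under a hard box model: the fraction of box b's volume
    covered by its intersection with box a."""
    (la, ua), (lb, ub) = box_a, box_b
    inter_lower = [max(x, y) for x, y in zip(la, lb)]
    inter_upper = [min(x, y) for x, y in zip(ua, ub)]
    vb = box_volume(lb, ub)
    return box_volume(inter_lower, inter_upper) / vb if vb else 0.0
```

A child box nested inside its parent yields P(parent | child) = 1 while P(child | parent) < 1, giving the asymmetric, probabilistic entailment that plain hyperbolic distances do not provide.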
“…where λ is the relative weight of these two losses³. Baselines: For the Ultra-Fine dataset, we compare with the following baselines: Onoe and Durrett (2019), which offers two multi-classifiers using BERT and ELMo as encoders, respectively; Choi et al. (2018), a multi-classifier using GloVe+LSTM as the encoder; Xiong et al. (2019), a multi-classifier using GloVe+LSTM as the encoder that exploits label co-occurrence by introducing associated labels to enrich the label representation; and López and Strube (2020), a hyperbolic multi-classifier using GloVe. For the OntoNotes dataset, in addition to the baselines for Ultra-Fine, we also compare with a multi-classifier using BERT as the encoder, with Lin and Ji (2019), a multi-classifier using ELMo as the encoder that exploits label co-occurrence by requiring the latent representation to reconstruct the co-occurrence association, and with a multi-classifier using ELMo as the encoder that exploits the label hierarchy via a hierarchy-aware loss function.…”
Section: Learning
Confidence: 99%
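The λ in this quote is the standard relative-weight pattern for combining two training objectives. A minimal sketch with binary cross-entropy standing in for both terms; the concrete losses in the cited model are not specified in this excerpt, so everything here is illustrative:

```python
import math

def bce(logit, target):
    """Binary cross-entropy on a single logit (numerically naive sketch)."""
    p = 1.0 / (1.0 + math.exp(-logit))
    return -(target * math.log(p) + (1 - target) * math.log(1 - p))

def combined_loss(main_pairs, aux_pairs, lam=0.5):
    """Total loss = mean main BCE + lam * mean auxiliary BCE.
    lam is the relative weight of the two terms, as in 'L = L1 + lam * L2'."""
    main = sum(bce(l, t) for l, t in main_pairs) / len(main_pairs)
    aux = sum(bce(l, t) for l, t in aux_pairs) / len(aux_pairs)
    return main + lam * aux
```

In practice λ is tuned on development data: λ → 0 recovers the plain classifier, while large λ lets the auxiliary term (e.g. a co-occurrence or hierarchy loss) dominate.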