Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing (ACL-IJCNLP 2015)
DOI: 10.3115/v1/p15-2048
Embedding Methods for Fine Grained Entity Type Classification

Abstract: We propose a new approach to the task of fine grained entity type classification based on label embeddings that allows for information sharing among related labels. Specifically, we learn an embedding for each label and each feature such that labels which frequently co-occur are close in the embedded space. We show that it outperforms state-of-the-art methods on two fine grained entity-classification benchmarks and that the model can exploit the finer-grained labels to improve classification of standard coars…
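The core idea in the abstract, embedding both features and labels in one shared space so that a mention is scored against a label by a dot product, can be sketched roughly as follows. Everything here is a hypothetical stand-in: the feature names, type labels, and dimensionality are illustrative, and the random vectors replace the embeddings the paper learns from data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical mention features and type labels (not from the paper).
features = ["head:obama", "context:president", "shape:Xxxxx"]
labels = ["/person", "/person/politician", "/location"]

dim = 8  # illustrative embedding dimensionality

# One embedding per feature and per label; the paper learns these
# jointly, here they are random placeholders.
feat_emb = {f: rng.normal(size=dim) for f in features}
label_emb = {l: rng.normal(size=dim) for l in labels}

def score(mention_feats, label):
    """Score a (mention, label) pair: sum the mention's feature
    embeddings and dot the result with the label embedding."""
    x = sum(feat_emb[f] for f in mention_feats)
    return float(x @ label_emb[label])

# Rank all labels for one mention; the highest-scoring label wins.
mention = ["head:obama", "context:president"]
ranking = sorted(labels, key=lambda l: -score(mention, l))
```

Because co-occurring labels end up close in the embedded space under the paper's training objective, a mention that scores highly for one label also tends to score highly for related labels, which is the information-sharing effect the abstract describes.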

Cited by 107 publications (152 citation statements)
References 5 publications
“…Thus, as an alternative to our scoring model S_c2t(c, t), we could use sentence-level entity classification systems such as FIGER (Ling and Weld, 2012) and (Yogatama et al., 2015)'s system. These systems are based on linguistic features different from the input representation we use, so a comparison with our embedding-based approach is interesting.…”
Section: Future Work
confidence: 99%
“…Dong et al. (2015) use distributed representations of words in a hybrid classifier to classify mentions to 20 types. Yogatama et al. (2015) classify mentions to more fine-grained types by using different features for mentions and embedding labels in the same space. These methods as well as standard NER systems try to maximize correct classification of mentions in individual contexts, whereas we aggregate individual contexts and evaluate on accuracy of entity-type assignments inferred from the entire corpus.…”
Section: Related Work
confidence: 99%
“…Table 1 shows performance of PthDCode on test, based on the interval [40000, 50000]; average and standard deviation are computed for 2000(20 + i), 0 ≤ i ≤ 5, as described above. PthDCode achieves clearly better results than other baseline methods — FIGER (Ling and Weld, 2012), (Yogatama et al., 2015) and (Shimaoka et al., 2017) — when trained on raw (i.e., not denoised) datasets of a similar size. The attentive encoder (Shimaoka et al., 2017) is a neural baseline for PthDCode; the comparison in Table 1 suggests that decoding the path hierarchy, rather than flat classification, significantly improves performance.…”
Section: Experiments and Results
confidence: 90%
“…In flat classification (e.g., FIGER (Ling and Weld, 2012), Attentive Encoder (Shimaoka et al., 2016; Shimaoka et al., 2017)), the task is formalized as a flat multiclass multilabel problem. In local classification (Gillick et al., 2014; Yosef et al., 2012; Yogatama et al., 2015), a separate local classifier is learned for each node of the hierarchy. In both approaches, some form of postprocessing is necessary to make the decisions consistent, e.g., an entity can only be a celebrity if they are also a person.…”
Section: Introduction
confidence: 99%
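The postprocessing step described in the excerpt above, making predictions consistent with the type hierarchy, can be illustrated with a minimal sketch. The hierarchy and type names here are hypothetical, not taken from any of the cited systems.

```python
# Hypothetical type hierarchy, mapping each child type to its parent.
parent = {
    "/person/celebrity": "/person",
    "/person/politician": "/person",
    "/location/city": "/location",
}

def make_consistent(predicted):
    """Enforce hierarchy consistency by adding every ancestor of each
    predicted type: a /person/celebrity prediction implies /person."""
    out = set(predicted)
    for t in predicted:
        while t in parent:
            t = parent[t]
            out.add(t)
    return out

print(sorted(make_consistent({"/person/celebrity"})))
# prints ['/person', '/person/celebrity']
```

This is the simplest possible consistency rule (adding missing ancestors); real systems may instead prune child labels whose ancestors were not predicted, which is an equally valid design choice.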
“…Recently, Yogatama et al [15] proposed a novel method to learn an embedding for each entity type and each feature. This way, feature vectors could be created for entity mentions in order to classify entities.…”
Section: Fine-grained Entity Recognition
confidence: 99%