Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers) 2016
DOI: 10.18653/v1/p16-1059

Collective Entity Resolution with Multi-Focal Attention

Abstract: Entity resolution is the task of linking each mention of an entity in text to the corresponding record in a knowledge base (KB). Coherence models for entity resolution encourage all referring expressions in a document to resolve to entities that are related in the KB. We explore attention-like mechanisms for coherence, where the evidence for each candidate is based on a small set of strong relations, rather than relations to all other entities in the document. The rationale is that document-wide support may simp…
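The abstract describes scoring each candidate entity by a small set of strong KB relations rather than by document-wide support. The minimal Python sketch below (not the authors' implementation) illustrates that idea as hard, top-K attention over pairwise relatedness scores; the function name, the `relatedness` callback, and the choice of K are assumptions made here for exposition.

from typing import Callable, List

def coherence_score(
    candidate: str,
    other_mention_candidates: List[List[str]],
    relatedness: Callable[[str, str], float],
    k: int = 3,
) -> float:
    """Sum of the k strongest KB-relation scores between `candidate` and the
    best-matching candidate of each other mention (hard, multi-focal support)."""
    # Strongest relation achievable against each other mention's candidate set.
    per_mention = [
        max(relatedness(candidate, c) for c in cands)
        for cands in other_mention_candidates
        if cands
    ]
    # Keep only the k strongest focal mentions as evidence; ignore the rest.
    return sum(sorted(per_mention, reverse=True)[:k])

Capping the support at the K strongest relations means a non-salient entity, or one sparsely connected in the KB, is not penalized for lacking relations to every other entity mentioned in the document, which matches the rationale stated in the abstract.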

Cited by 106 publications (137 citation statements); References 26 publications
“…Many approaches perform joint inference over the linking decisions in a document (Milne and Witten, 2008; Ratinov et al., 2011; Hoffart et al., 2011; Globerson et al., 2016), identify mentions that do not link to any existing entity (NIL) (Bunescu and Pasca, 2006; Ratinov et al., 2011), and cluster NIL mentions (Wick et al., 2013; Lazic et al., 2015) to discover new entities. Few approaches jointly model entity linking and other related NLP tasks to improve linking, such as coreference resolution (Hajishirzi et al., 2013), relational inference (Cheng and Roth, 2013), and joint coreference with typing (Durrett and Klein, 2014). [Figure 1 of the citing paper: Overview of the Model (§3): Each entity has a Wikipedia description, linked mentions in Wikipedia (only one shown), and fine-grained types from Freebase (only one shown).]…”
Section: Related Work
Citation type: mentioning; confidence: 99%
“…Lazic et al. (2015): 86.4; Huang et al. (2015): 86.6; Chisholm and Hachey (2015): 88.7; Ganea et al. (2016): 87.6; Guo and Barbosa (2016): 89.0; Globerson et al. (2016): 91.0; Yamada et al. (2016): 91.5; Ganea and Hofmann (2017): 92.2…”
Section: System
Citation type: mentioning; confidence: 99%
“…For example, the local component of the GLOW algorithm (Ratinov et al., 2011) was used as part of the relational inference system suggested by Cheng and Roth (2013). Similarly, Globerson et al. (2016) achieved state-of-the-art results by extending the local-based selective-context model of Lazic et al. (2015) with an attention-like coherence model.…”
Section: Related Work
Citation type: mentioning; confidence: 84%
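The last excerpt notes that Globerson et al. (2016) extend a local selective-context model with an attention-like coherence model. As a rough illustration only, and not the published training or inference procedure, the sketch below combines a per-mention local score with a top-K coherence term (mirroring the earlier sketch) via greedy coordinate ascent; `local_score`, `relatedness`, the candidate lists, and the fixed number of rounds are all assumed placeholders.

from typing import Callable, List

def resolve_document(
    mention_candidates: List[List[str]],
    local_score: Callable[[int, str], float],   # score of candidate c for mention i (assumed given)
    relatedness: Callable[[str, str], float],   # KB relatedness of two entities (assumed given)
    k: int = 3,
    rounds: int = 5,
) -> List[str]:
    """Greedy coordinate ascent: repeatedly re-link each mention to the candidate
    that maximizes its local score plus top-k coherence with the current links."""
    # Start from the best purely local candidate for every mention.
    links = [max(cands, key=lambda c: local_score(i, c))
             for i, cands in enumerate(mention_candidates)]
    for _ in range(rounds):
        for i, cands in enumerate(mention_candidates):
            others = [e for j, e in enumerate(links) if j != i]

            def score(c: str) -> float:
                # Coherence evidence comes from the k strongest links only.
                support = sorted((relatedness(c, e) for e in others), reverse=True)[:k]
                return local_score(i, c) + sum(support)

            links[i] = max(cands, key=score)
    return links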