2021
DOI: 10.48550/arxiv.2110.02369
Preprint

EntQA: Entity Linking as Question Answering

Abstract: A conventional approach to entity linking is to first find mentions in a given document and then infer their underlying entities in the knowledge base. A well-known limitation of this approach is that it requires finding mentions without knowing their entities, which is unnatural and difficult. We present a new model that does not suffer from this limitation called EntQA, which stands for Entity linking as Question Answering. EntQA first proposes candidate entities with a fast retrieval module, and then scruti…
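The abstract describes a retrieve-then-read pipeline: a retrieval module first proposes candidate entities for the whole document, and a reader then locates the mentions of each candidate, framed as extractive question answering. The following minimal Python sketch illustrates that control flow only; the toy lexical scorers, the Entity class, and the names retrieve_candidates and read_mentions are assumptions made for this example and stand in for the paper's dense retriever and Transformer reader.

# Illustrative sketch of a retrieve-then-read entity-linking pipeline.
# Both stages use toy lexical scorers so the example runs without any
# pretrained models; this is not the authors' implementation.

from dataclasses import dataclass

@dataclass
class Entity:
    title: str
    description: str

def retrieve_candidates(document: str, kb: list[Entity], top_k: int = 2) -> list[Entity]:
    """Stage 1: propose candidate entities for the whole document.
    Toy scorer: overlap between the document and the entity description."""
    doc_words = set(document.lower().split())
    scored = [(len(doc_words & set(e.description.lower().split())), e) for e in kb]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [e for score, e in scored[:top_k] if score > 0]

def read_mentions(document: str, entity: Entity) -> list[tuple[int, int]]:
    """Stage 2: treat the candidate entity as a 'question' and extract the
    mention spans (character offsets) that answer it in the document.
    Toy reader: exact title match; a candidate with no span is rejected."""
    spans, start = [], 0
    needle, haystack = entity.title.lower(), document.lower()
    while (idx := haystack.find(needle, start)) != -1:
        spans.append((idx, idx + len(needle)))
        start = idx + 1
    return spans

if __name__ == "__main__":
    kb = [
        Entity("Paris", "capital city of France on the Seine"),
        Entity("Paris Hilton", "American media personality and businesswoman"),
        Entity("Tokyo", "capital city of Japan"),
    ]
    doc = "Paris is the capital of France and sits on the Seine."
    for entity in retrieve_candidates(doc, kb):
        print(entity.title, "->", read_mentions(doc, entity))

Note how the reader can return no span for a retrieved candidate (here, Tokyo), which mirrors the paper's design of letting the reader reject spurious candidates proposed by the retriever.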

Cited by 1 publication (1 citation statement)
References 39 publications
“…Retrieval-based NLP For many NLP tasks [Li et al., 2023a, 2022c], retrieval is an effective method to utilize external knowledge. Knowledge retrieval has been utilized in a variety of NLP tasks [Li et al., 2022b; Li et al., 2022a], including question answering [Xu et al., 2022; Wang et al., 2022a], machine translation [Xu et al., 2020], NER [Wang et al., 2022b], and entity linking [Zhang et al., 2021; Li et al., 2023b]. Recently, knowledge retrieval has been introduced to language model pretraining.…”
Section: Related Work
confidence: 99%