Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP 2020)
DOI: 10.18653/v1/2020.emnlp-main.522
Efficient One-Pass End-to-End Entity Linking for Questions

Abstract: We present ELQ, a fast end-to-end entity linking model for questions, which uses a biencoder to jointly perform mention detection and linking in one pass. Evaluated on WebQSP and GraphQuestions with extended annotations that cover multiple entities per question, ELQ outperforms the previous state of the art by a large margin of +12.7% and +19.6% F1, respectively. With a very fast inference time (1.57 examples/s on a single CPU), ELQ can be useful for downstream question answering systems. In a proof-of-concept…
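To make the abstract's "one pass" idea concrete, the following is a minimal, purely illustrative sketch (not the authors' code): a biencoder scores every candidate span of a question for mention-ness, and for spans that pass a threshold, links them to the entity whose precomputed embedding has the highest dot product with the span representation. All names, the toy 4-dimensional embeddings, and the mean-pooled span representation are hypothetical simplifications of ELQ's architecture.

```python
# Illustrative sketch of ELQ-style one-pass mention detection + linking.
# Hypothetical toy embeddings; real ELQ uses BERT-based encoders.
import numpy as np

rng = np.random.default_rng(0)
DIM = 4

# Entity embeddings are precomputed once offline by an entity encoder,
# which is what makes inference fast: only the question is encoded online.
entity_embs = {name: rng.normal(size=DIM) for name in ["Q1", "Q2", "Q3"]}

def link(token_embs: np.ndarray, threshold: float = 0.0):
    """One pass over the question: score all spans as candidate mentions,
    then link each detected span to its best-matching entity."""
    n = token_embs.shape[0]
    results = []
    for i in range(n):
        for j in range(i, n):
            span = token_embs[i:j + 1].mean(axis=0)   # span representation
            mention_score = float(span @ span)        # stand-in detector score
            if mention_score <= threshold:
                continue
            # Link step: rank entities by dot-product similarity to the span.
            best = max(entity_embs, key=lambda e: float(span @ entity_embs[e]))
            results.append((i, j, best))
    return results

token_embs = rng.normal(size=(3, DIM))  # encoder output for a 3-token question
print(link(token_embs))
```

Because mention detection and linking share one question encoding, no separate NER pass is needed; this joint, single-pass design is what the abstract credits for the speed.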

Cited by 71 publications (65 citation statements). References 20 publications.
“…Our work is also closely related to entity linking and schema linking, which can be viewed as subareas of grounding on specific scenarios. Given an utterance, entity linking aims at finding all mentioned entities in it using a knowledge base as candidate pool (Tan et al, 2017;Chen et al, 2018;Li et al, 2020a), while schema linking tries to find all mentioned schemas related to specific databases (Dong et al, 2019;Lei et al, 2020;Shi et al, 2020). Previous work generally either employed full supervision to train linking models (Li et al, 2020a;Lei et al, 2020;Shi et al, 2020), or treated linking as a minor pre-processing (Yu et al, 2018a;Guo et al, 2019;Lin et al, 2019a) and used heuristic rules to obtain the result.…”
Section: Related Work
confidence: 99%
“…Given an utterance, entity linking aims at finding all mentioned entities in it using a knowledge base as candidate pool (Tan et al, 2017;Chen et al, 2018;Li et al, 2020a), while schema linking tries to find all mentioned schemas related to specific databases (Dong et al, 2019;Lei et al, 2020;Shi et al, 2020). Previous work generally either employed full supervision to train linking models (Li et al, 2020a;Lei et al, 2020;Shi et al, 2020), or treated linking as a minor pre-processing (Yu et al, 2018a;Guo et al, 2019;Lin et al, 2019a) and used heuristic rules to obtain the result. Our work is different from them since we optimize the linking model with weak supervision from downstream signals, which is flexible and practicable.…”
Section: Related Work
confidence: 99%
“…(Sakor et al, 2019) also jointly extracts relation spans, which aid in overall linking performance. The recent ELQ (Li et al, 2020) extends BLINK to jointly learn mention detection and linking. In contrast, we focus solely on linking and take a different strategy based on combining logic rules with learning.…”
Section: Related Work
confidence: 99%