2017
DOI: 10.1007/978-3-319-59888-8_15

Joint Entity Recognition and Linking in Technical Domains Using Undirected Probabilistic Graphical Models

Abstract: The problems of recognizing mentions of entities in texts and linking them to unique knowledge base identifiers have received considerable attention in recent years. In this paper we present a probabilistic system based on undirected graphical models that jointly addresses both the entity recognition and the linking task. Our framework considers the span of mentions of entities as well as the corresponding knowledge base identifier as random variables and models the joint assignment using a factorize…
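
As a rough illustration of the factorized model sketched in the abstract, the snippet below treats mention spans and knowledge base identifiers as the variables of a joint assignment and scores an assignment as a sum of per-annotation factor scores. This is only an assumed reading of the abstract, not the authors' implementation; the annotation layout and the feature templates are hypothetical.

```python
from typing import Dict, List, Tuple

# A joint assignment ("state") is a set of annotations; each annotation pairs
# a mention span with a knowledge base identifier, both treated as variables.
Annotation = Tuple[int, int, str]   # (start token, end token, concept id)

def factor_score(ann: Annotation, tokens: List[str],
                 weights: Dict[str, float]) -> float:
    """Score one (span, concept) assignment with weighted local features."""
    start, end, concept_id = ann
    surface = " ".join(tokens[start:end]).lower()
    feats = {
        f"surface={surface}|concept={concept_id}": 1.0,
        f"mention_length={end - start}": 1.0,
    }
    return sum(weights.get(name, 0.0) * val for name, val in feats.items())

def state_score(state: List[Annotation], tokens: List[str],
                weights: Dict[str, float]) -> float:
    """Unnormalized score of a joint assignment: the model factorizes over
    annotations, so the total is the sum of the per-factor scores."""
    return sum(factor_score(a, tokens, weights) for a in state)
```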

Cited by 8 publications (8 citation statements)
References 17 publications
“…Dictionary-based approaches can detect and normalize concept mentions in a single step (Tseytlin et al., 2016; Pafilis et al., 2013), even though postfiltering (Basaldella et al., 2017; Cuzzola et al., 2017) or other strategies are usually required to achieve good performance. Example-based approaches include probabilistic (Leaman and Lu, 2016) and graphical (Lou et al., 2017; ter Horst et al., 2017) systems for jointly learning NER+NEN in shared or interdependent models. Zhao et al. (2019) propose a multi-task-learning set-up for neural NER and NEN with bidirectional feedback, as mentioned earlier.…”
Section: Related Work
confidence: 99%
“…The first group joins the ED and EL tasks into one objective function and infers the parameters to automatically leverage their mutual dependency. Luo et al. [3] and ter Horst et al. [6] modeled these two tasks as a probabilistic factor graph and estimated their parameters with gradient descent. Sil and Yates [18] used a maximum-entropy model to select correct mention-entity pairs by adding the dependent features of NER and EL.…”
Section: Joint Methods
confidence: 99%
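
The factor-graph-with-gradient-descent approach quoted above (the works cited there as [3] and [6]) can be pictured with a small parameter-update sketch. The perceptron-style gradient step below is a generic stand-in assumed for illustration, not the exact estimation procedure of those systems; the feature templates mirror the hypothetical scoring sketch given earlier.

```python
from typing import Dict, List, Tuple

Annotation = Tuple[int, int, str]   # (start token, end token, concept id)

def features(state: List[Annotation], tokens: List[str]) -> Dict[str, float]:
    """Aggregate the local factor features of a joint assignment
    (hypothetical templates, matching the earlier scoring sketch)."""
    feats: Dict[str, float] = {}
    for start, end, concept_id in state:
        surface = " ".join(tokens[start:end]).lower()
        for name in (f"surface={surface}|concept={concept_id}",
                     f"mention_length={end - start}"):
            feats[name] = feats.get(name, 0.0) + 1.0
    return feats

def gradient_step(weights: Dict[str, float],
                  gold: List[Annotation], predicted: List[Annotation],
                  tokens: List[str], lr: float = 0.1) -> None:
    """One perceptron-style gradient update: move the weights toward the
    features of the gold assignment and away from the current prediction."""
    g, p = features(gold, tokens), features(predicted, tokens)
    for name in set(g) | set(p):
        weights[name] = weights.get(name, 0.0) + lr * (g.get(name, 0.0) - p.get(name, 0.0))
```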
“…Recently, many researchers have proposed joint methods that couple ED and EL. In one method, their features are designed separately and then integrated into a joint objective function [3], [5], [6]. In another, an interactive architecture is constructed that iteratively exchanges the extracted information between these two tasks [7]-[9].…”
Section: Introduction
confidence: 99%
“…In each iteration, a segmentation explorer and a concept explorer are consecutively applied in order to generate a set of proposal states. The segmentation explorer (recognition) is able to add a new non-overlapping segmentation, remove an existing segmentation, or apply a synonym replacement to a token within an existing segmentation. The concept explorer (linking) can assign, change, or remove a concept to/from any segmentation.…”
Section: Inference
confidence: 99%
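
The quoted inference procedure can be read as a proposal-based search over joint states. The sketch below is a simplified, greedy variant assumed for illustration: the synonym-replacement move is omitted, the scoring function is passed in (for example the hypothetical state_score sketch above), and the original system's acceptance strategy may be stochastic rather than a strict maximum.

```python
from typing import Callable, List, Tuple

Annotation = Tuple[int, int, str]   # (start token, end token, concept id)
UNLINKED = "NONE"                   # hypothetical sentinel for an unlinked segment

def segmentation_explorer(state: List[Annotation],
                          n_tokens: int) -> List[List[Annotation]]:
    """Propose states that add a new non-overlapping segment or remove one."""
    covered = {i for (s, e, _) in state for i in range(s, e)}
    proposals = [state + [(i, i + 1, UNLINKED)]
                 for i in range(n_tokens) if i not in covered]
    proposals += [[a for a in state if a is not old] for old in state]
    return proposals

def concept_explorer(state: List[Annotation],
                     concepts: List[str]) -> List[List[Annotation]]:
    """Propose states that assign, change, or remove a concept on a segment."""
    proposals = []
    for idx, (s, e, _) in enumerate(state):
        for cid in concepts + [UNLINKED]:
            proposals.append(state[:idx] + [(s, e, cid)] + state[idx + 1:])
    return proposals

def infer(tokens: List[str], concepts: List[str],
          score: Callable[[List[Annotation]], float],
          iterations: int = 20) -> List[Annotation]:
    """Greedy exploration loop: apply both explorers in turn each iteration
    and keep the best-scoring proposal (or the current state)."""
    state: List[Annotation] = []
    for _ in range(iterations):
        for proposals in (segmentation_explorer(state, len(tokens)),
                          concept_explorer(state, concepts)):
            state = max(proposals + [state], key=score)
    return state
```

With the earlier sketch, the score argument could be supplied as lambda s: state_score(s, tokens, weights).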
“…This tutorial ends with Section 5 in which we conclude our proposed approach. Parts of the materials presented here are taken from our previous publications [4,7,8].…”
Section: Introduction
confidence: 99%