Proceedings of the 15th Conference of the European Chapter of the Association for Computational Linguistics: Volume 1, 2017
DOI: 10.18653/v1/e17-1111

Noise Mitigation for Neural Entity Typing and Relation Extraction

Abstract: In this paper, we address two different types of noise in information extraction models: noise from distant supervision and noise from pipeline input features. Our target tasks are entity typing and relation extraction. For the first noise type, we introduce multi-instance multi-label learning algorithms using neural network models, and apply them to fine-grained entity typing for the first time. Our model outperforms the state-of-the-art supervised approach, which uses global embeddings of entities. For the se…
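The abstract's multi-instance multi-label (MIML) idea can be illustrated with a minimal sketch: under distant supervision, type labels attach to an *entity*, not to any single mention, so per-mention type scores are aggregated (here via max over mentions) before a per-label sigmoid loss. All shapes, names, and the max-aggregation choice are illustrative assumptions, not the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

n_mentions, n_types = 4, 5                               # instances per entity, label set size
mention_scores = rng.normal(size=(n_mentions, n_types))  # per-mention type logits

# Multi-instance step: at least one mention should express each gold type,
# so keep the most confident mention per type.
entity_logits = mention_scores.max(axis=0)               # shape: (n_types,)

# Multi-label step: independent sigmoid per type (types are not mutually exclusive).
entity_probs = 1.0 / (1.0 + np.exp(-entity_logits))

# Binary cross-entropy against the (noisy) distant-supervision label vector.
gold = np.array([1, 0, 1, 0, 0], dtype=float)
eps = 1e-9
loss = -np.mean(gold * np.log(entity_probs + eps)
                + (1 - gold) * np.log(1 - entity_probs + eps))
print(entity_probs.shape)
```

The max-aggregation encodes the at-least-one assumption of multi-instance learning; softer aggregators (mean, attention) are common alternatives.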

Cited by 47 publications (33 citation statements)
References 25 publications (43 reference statements)
“…Incorporating fine-grained entity types has improved entity-focused downstream tasks, such as relation extraction (Yaghoobzadeh et al, 2017a), question answering (Yavuz et al, 2016), query analysis (Balog and Neumayer, 2012), and coreference resolution (Durrett and Klein, 2014). These systems used a relatively coarse type ontology.…”
Section: Introduction (mentioning)
Confidence: 99%
“…Zeng et al [13] adopted piecewise convolution neural networks (PCNNs), which use the position information of words in a sentence to model the sentence representation. Yaghoobzadeh et al [14] also tried to mitigate the noise in DS by combining entity type and relation extraction models. Vashishth et al [15] used entity type and relation alias information to impose soft constraints when predicting relations.…”
Section: Related Work (mentioning)
Confidence: 99%
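The piecewise pooling behind the PCNNs cited above can be sketched briefly: convolution features over the sentence are max-pooled separately within the three segments delimited by the two entity positions, rather than with one global max-pool, so coarse positional structure survives. Shapes and names below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

seq_len, n_filters = 12, 8
conv_features = rng.normal(size=(seq_len, n_filters))  # post-convolution feature map
e1_pos, e2_pos = 3, 8                                  # entity token indices (assumed)

# Split the feature map at the two entity positions into three pieces.
segments = [conv_features[:e1_pos + 1],
            conv_features[e1_pos + 1:e2_pos + 1],
            conv_features[e2_pos + 1:]]

# One max per filter per segment, concatenated into a fixed-size vector.
pooled = np.concatenate([seg.max(axis=0) for seg in segments])
print(pooled.shape)  # 3 * n_filters = 24
```

A single global max-pool would collapse this to an `n_filters`-sized vector and lose which side of each entity a strong feature fired on.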
“…The third approach for integrating entity information into a convolutional neural network for relation classification is based on structured prediction, as we originally presented for a table filling evaluation of entity and relation recognition (Adel & Schütze, 2017). While we applied it only to a manually labeled dataset in that previous work, we now adapt it to the slot filling pipeline setting with distantly supervised training data for the first time.…”
Section: Structured Prediction (mentioning)
Confidence: 99%
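The table-filling formulation referenced above can be sketched as follows: predictions fill a square table over the sentence tokens, with diagonal cells carrying entity tags and cell (i, j) carrying the relation between the entities headed at tokens i and j. The tokens and label strings here are illustrative, not the paper's tag set.

```python
tokens = ["Anne", "works", "for", "Acme"]
n = len(tokens)

# Empty table; "O" marks no entity (diagonal) or no relation (off-diagonal).
table = [["O"] * n for _ in range(n)]

# Diagonal cells: entity tags.
table[0][0] = "PER"      # Anne
table[3][3] = "ORG"      # Acme

# Off-diagonal cell (i, j): relation between the pair; since (0, 3) and
# (3, 0) are distinct cells, the table also encodes argument order.
table[0][3] = "works_for"

for i, row in enumerate(table):
    print(tokens[i], row)
```

Filling the whole table jointly lets entity and relation decisions constrain each other, which is the point of the structured-prediction view.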
“…To calculate scores for the possible relations, all three contexts are used. In contrast to the model we proposed earlier (Adel & Schütze, 2017), we again use the flag for the order of the relation arguments for classifying the relation. Also, we do not compute representations for entities for slot filling, as motivated in Section 3.1.…”
Section: Structured Prediction (mentioning)
Confidence: 99%