Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL-HLT 2019)
DOI: 10.18653/v1/n19-1288
Distant Supervision Relation Extraction with Intra-Bag and Inter-Bag Attentions

Abstract: This paper presents a neural relation extraction method to deal with the noisy training data generated by distant supervision. Previous studies mainly focus on sentence-level de-noising by designing neural networks with intra-bag attentions. In this paper, both intra-bag and inter-bag attentions are considered in order to deal with the noise at sentence level and bag level, respectively. First, relation-aware bag representations are calculated by weighting sentence embeddings using intra-bag attentions. Here, each…
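As a concrete illustration of the two attention levels the abstract describes, here is a minimal PyTorch-style sketch, assuming simple dot-product scoring; the tensor names (`sent_embs`, `rel_query`, `bag_reprs`) are hypothetical and this is not the authors' released code.

```python
# Minimal sketch of the two attention levels described in the abstract.
# Dot-product scoring is an assumption; this is not the authors' code.
import torch
import torch.nn.functional as F

def intra_bag_attention(sent_embs: torch.Tensor, rel_query: torch.Tensor) -> torch.Tensor:
    """Weight the sentences of one bag by similarity to a relation query,
    producing a relation-aware bag representation.

    sent_embs: (num_sents, dim) sentence embeddings for one bag
    rel_query: (dim,) embedding of the candidate relation
    """
    scores = sent_embs @ rel_query          # (num_sents,) sentence-relation match
    alpha = F.softmax(scores, dim=0)        # sentence-level attention weights
    return alpha @ sent_embs                # (dim,) bag representation

def inter_bag_attention(bag_reprs: torch.Tensor, rel_query: torch.Tensor) -> torch.Tensor:
    """Weight a group of bags that share the same (possibly noisy) relation
    label, down-weighting bags that match the label poorly.

    bag_reprs: (num_bags, dim) outputs of intra_bag_attention
    rel_query: (dim,) embedding of the shared relation label
    """
    scores = bag_reprs @ rel_query          # (num_bags,) bag-relation match
    beta = F.softmax(scores, dim=0)         # bag-level attention weights
    return beta @ bag_reprs                 # (dim,) group representation
```

In the paper, bags carrying the same relation label are grouped together so that the second attention layer can suppress bags whose bag-level label is itself wrong.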

Cited by 105 publications (52 citation statements); references 13 publications.

“…In particular, instead of generating a sentence-level label, this framework assigns a label to a bag of sentences containing a common entity pair, where the label is a relationship of the entity pair on the knowledge graph. Recently, based on the labeled data at the bag level, a line of works (Zeng et al. 2015; Du et al. 2018; Lin et al. 2016; Han et al. 2018; Ye and Ling 2019) under the selective attention framework (Lin et al. 2016) lets the model implicitly focus on the correctly labeled sentence(s) by an attention mechanism and thus learn a stable and robust model from the noisy data.…”
Section: Introduction (mentioning)
confidence: 99%
“…Model comparison regarding the AUC value. The comparative results are reported by Han et al. (2018) and Ye and Ling (2019), respectively.…”
(mentioning)
confidence: 99%
“…Initially, a CNN-based method was proposed by [24] to automatically capture the semantics of sentences, while PCNN [25] became the common architecture for embedding sentences. PCNN is used in several approaches that handle DS noise patterns, such as intra-bag attention [5], inter-bag attention [26], soft labeling [27], [28], and adversarial training [29], [30]. Moreover, Graph-CNNs proved to be an effective way to encode syntactic information from text [8].…”
Section: Related Work (mentioning)
confidence: 99%
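The piecewise pooling that distinguishes PCNN from a plain CNN can be sketched as follows. This is a hedged illustration, assuming both entity mentions lie strictly inside the sentence so that every segment is non-empty; names and shapes are assumptions, not the implementation from [25].

```python
# Rough sketch of PCNN-style piecewise max pooling (Zeng et al. 2015 [25]).
# The segmentation convention and all names are assumptions; it presumes
# both entities lie strictly inside the sentence (non-empty pieces).
import torch
import torch.nn as nn

class PCNN(nn.Module):
    def __init__(self, emb_dim: int, num_filters: int = 230, kernel: int = 3):
        super().__init__()
        self.conv = nn.Conv1d(emb_dim, num_filters, kernel, padding=kernel // 2)

    def forward(self, word_embs: torch.Tensor, head_pos: int, tail_pos: int) -> torch.Tensor:
        """word_embs: (seq_len, emb_dim); head_pos/tail_pos: entity indices.
        Returns a (3 * num_filters,) sentence representation."""
        h = self.conv(word_embs.T.unsqueeze(0)).squeeze(0)   # (num_filters, seq_len)
        lo, hi = sorted((head_pos, tail_pos))
        # Max-pool separately over the three segments delimited by the entities.
        pieces = (h[:, : lo + 1], h[:, lo + 1 : hi + 1], h[:, hi + 1 :])
        pooled = [p.max(dim=1).values for p in pieces]
        return torch.tanh(torch.cat(pooled))
```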
“…The assumption is relaxed by Riedel et al. (2010) as the expressed-at-least-once assumption, which says that if two entities participate in a relation, at least one sentence that mentions these two entities might express that relation. To augment the dataset for the relation classification task, for each pair of entities that appears in some Freebase relation, Ye and Ling (2019) found all sentences containing those entities in a large unlabeled corpus and extracted textual features to train the classifier. Wei and Zou (2019) proposed an Easy Data Augmentation (EDA) method.…”
Section: Related Work (mentioning)
confidence: 99%
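Since the statement names EDA concretely, here is a small self-contained sketch of two of its four operations, random swap and random deletion, following the standard formulation from Wei and Zou (2019); synonym replacement and random insertion are omitted because they require a thesaurus such as WordNet, and the parameter names are assumptions.

```python
# Small sketch of two of EDA's four operations (Wei and Zou 2019): random
# swap and random deletion. Synonym replacement and random insertion are
# omitted since they need a thesaurus; parameter names are assumptions.
import random

def random_swap(words: list, n: int = 1) -> list:
    """Swap two randomly chosen positions, n times (needs >= 2 words)."""
    words = words[:]
    for _ in range(n):
        i, j = random.sample(range(len(words)), 2)
        words[i], words[j] = words[j], words[i]
    return words

def random_deletion(words: list, p: float = 0.1) -> list:
    """Drop each word independently with probability p, keeping at least one."""
    kept = [w for w in words if random.random() > p]
    return kept if kept else [random.choice(words)]

sentence = "distant supervision generates noisy training data".split()
print(random_swap(sentence))
print(random_deletion(sentence))
```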