Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), 2019
DOI: 10.18653/v1/d19-1039

Leveraging 2-hop Distant Supervision from Table Entity Pairs for Relation Extraction

Abstract: Distant supervision (DS) has been widely used to automatically construct (noisy) labeled data for relation extraction (RE). Given two entities, distant supervision exploits sentences that directly mention them for predicting their semantic relation. We refer to this strategy as 1-hop DS, which unfortunately may not work well for long-tail entities with few supporting sentences. In this paper, we introduce a new strategy named 2-hop DS to enhance distantly supervised RE, based on the observation that there exist…
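The abstract's contrast between 1-hop and 2-hop DS can be made concrete with a small labeling sketch. The data structures (kb_facts, sentences, table_column_pairs) and the rule that entity pairs sharing a web-table column pair express the same relation are illustrative assumptions for this sketch, not the paper's exact pipeline.

```python
# Hypothetical sketch of 1-hop vs. 2-hop distant supervision labeling.
# All names and data structures here are illustrative assumptions.

from collections import defaultdict

def one_hop_label(kb_facts, sentences):
    """1-hop DS: label a sentence with relation rel if it directly
    mentions both entities of a KB fact (e1, rel, e2)."""
    labeled = []
    for sent in sentences:                      # sent = {"text": str, "entities": set}
        for e1, rel, e2 in kb_facts:
            if e1 in sent["entities"] and e2 in sent["entities"]:
                labeled.append((sent["text"], e1, e2, rel))
    return labeled

def two_hop_label(kb_facts, sentences, table_column_pairs):
    """2-hop DS (sketch): a target pair (e1, e2) with few supporting
    sentences borrows evidence from "bridge" pairs (b1, b2) that co-occur
    with it in the same pair of web-table columns, assuming rows of the
    same column pair express the same relation."""
    pairs_by_column = defaultdict(set)
    for col_id, pairs in table_column_pairs.items():   # col_id -> set of entity pairs
        pairs_by_column[col_id].update(pairs)

    labeled = []
    for e1, rel, e2 in kb_facts:
        for col_id, pairs in pairs_by_column.items():
            if (e1, e2) not in pairs:
                continue
            # Every other pair from the same column pair acts as a bridge:
            # sentences mentioning the bridge pair support (e1, rel, e2).
            for b1, b2 in pairs - {(e1, e2)}:
                for sent in sentences:
                    if b1 in sent["entities"] and b2 in sent["entities"]:
                        labeled.append((sent["text"], e1, e2, rel))
    return labeled
```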

Cited by 13 publications (11 citation statements) | References 23 publications
“…Specifically, in the proposed framework: 1) to learn a comprehensive sentence representation for each sentence in the corpus and web tables (high-quality relational tables extracted from http://websail-fe.cs.northwestern.edu/TabEL/#content-code, access date: 19 April 2013) [10], we first combine the multi-head self-attention mechanism and a piecewise convolutional neural network (PCNN) for the sentence encoder; 2) in this stage, we use a noise detection strategy to address the issue of noisy labeling. To evaluate the correlation between sentences and labels, we calculate the similarity between entity-aware embeddings and each sentence representation.…”
Section: Bag (citation type: mentioning)
confidence: 99%
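The encoder described in this citation statement (multi-head self-attention followed by a PCNN, with similarity-based noise scoring against entity-aware embeddings) can be sketched roughly as below. Dimensions, the pooling scheme, and all class/function names are assumptions for illustration, not the cited framework's actual code.

```python
# Minimal PyTorch sketch of a self-attention + PCNN sentence encoder with
# a cosine-similarity noise score. Shapes and hyperparameters are assumed.

import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentivePCNNEncoder(nn.Module):
    def __init__(self, emb_dim=100, n_heads=4, n_filters=230, kernel=3):
        super().__init__()
        self.self_attn = nn.MultiheadAttention(emb_dim, n_heads, batch_first=True)
        self.conv = nn.Conv1d(emb_dim, n_filters, kernel, padding=kernel // 2)
        self.out_dim = 3 * n_filters  # three pooled pieces (PCNN)

    def forward(self, word_emb, seg_mask):
        # word_emb: (B, L, D) token embeddings
        # seg_mask: (B, L, 3) one-hot indicator of the piece each token falls in
        h, _ = self.self_attn(word_emb, word_emb, word_emb)       # (B, L, D)
        c = self.conv(h.transpose(1, 2)).transpose(1, 2)          # (B, L, F)
        pieces = []
        for i in range(3):
            m = seg_mask[..., i].unsqueeze(-1)                    # (B, L, 1)
            # piecewise max pooling: mask out the other pieces, max over length
            pieces.append(c.masked_fill(m == 0, float("-inf")).max(dim=1).values)
        return torch.tanh(torch.cat(pieces, dim=-1))              # (B, 3F)

def noise_scores(sent_reprs, entity_aware_emb):
    """Cosine similarity between each sentence representation and an
    entity-aware embedding; low similarity flags a likely noisy label."""
    return F.cosine_similarity(sent_reprs, entity_aware_emb.unsqueeze(0), dim=-1)
```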
“…Beltagy et al [9] combined distant supervision with directly supervised data and used it to improve the weighting of relevant sentences. Deng et al [10] proposed a hierarchical framework to fuse information from DS and web tables that share relational facts about entities to further improve RE.…”
Section: Related Work (citation type: mentioning)
confidence: 99%