2019
DOI: 10.1609/aaai.v33i01.33016513
Neural Relation Extraction within and across Sentence Boundaries

Abstract: Past work in relation extraction mostly focuses on binary relations between entity pairs within a single sentence. Recently, the NLP community has gained interest in relation extraction for entity pairs spanning multiple sentences. In this paper, we propose a novel architecture for this task: inter-sentential dependency-based neural networks (iDepNN). iDepNN models the shortest and augmented dependency paths via recurrent and recursive neural networks to extract relationships within (intra-) and across (inter-) se…


Cited by 92 publications (79 citation statements)
References 23 publications
“…Additionally, we apply a decision function such that a sentence is tagged as propaganda if prediction probability of the classifier is greater than a threshold (τ ). We relax the binary decision boundary to boost recall, similar to Gupta et al (2019b).…”
Section: Sentence-level Propaganda Detection
confidence: 99%
“…In particular, extracting cross-sentence events, where relevant entities are found across multiple sentences, is considered challenging because of the lack of clear linguistic evidence for how entities mentioned in different sentences are associated to form a single event. For this reason, few studies have looked into this issue [15][16][17][18][19][20].…”
Section: Introduction
confidence: 99%
“…Some follow-up studies have been conducted since then, and updated the state-of-the-art performance with diverse types of neural networks built over syntactic dependency paths between entities. Examples include bidirectional long-short term memory networks [23], gated recurrent unit networks [24], and recursive and recurrent neural networks [20]. However, they either did not extract cross-sentence events at all, or achieved the best results only when cross-sentence events are ignored.…”
Section: Introduction
confidence: 99%
“…Motivation: Wikipedia and knowledge repositories derived from it are useful in a variety of tasks. They pertain to knowledge acquisition from text [12,14,18,[30][31][32], text analysis [22,23] and information retrieval [3,5,6,15,16,24,28,33] including commercial Web search, helping to potentially transform search results from sets of hyperlinks to relevant documents into sets of concepts directly relevant to users' queries [26]. Most Wikipedia articles correspond to concepts that are instances ("Wynnewood Valley Park Sensory Garden") as opposed to classes ("Garden").…”
Section: Introduction
confidence: 99%