Proceedings of the 23rd Conference on Computational Natural Language Learning (CoNLL) 2019
DOI: 10.18653/v1/k19-1056

Effective Attention Modeling for Neural Relation Extraction

Abstract: Relation extraction is the task of determining the relation between two entities in a sentence. Distantly-supervised models are popular for this task. However, sentences can be long and two entities can be located far from each other in a sentence. The pieces of evidence supporting the presence of a relation between two entities may not be very direct, since the entities may be connected via some indirect links such as a third entity or via coreference. Relation extraction in such scenarios becomes more challenging […]
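The abstract describes relation classification for an entity pair whose supporting evidence may be spread across a long sentence. The code below is a minimal sketch of that general setup under assumed choices (a BiLSTM encoder plus attention over tokens conditioned on the two entity representations, then a softmax relation classifier); it is not the paper's actual architecture, and every name and dimension here is hypothetical.

# Minimal sketch (assumed architecture, not the paper's model): attention over
# sentence tokens conditioned on the two entity positions, followed by a
# softmax relation classifier.
import torch
import torch.nn as nn

class AttnRelationClassifier(nn.Module):
    def __init__(self, vocab_size, emb_dim, hidden_dim, num_relations):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.LSTM(emb_dim, hidden_dim, batch_first=True,
                               bidirectional=True)
        # scores each token against the entity-pair query (2H token + 2H query)
        self.attn = nn.Linear(4 * hidden_dim, 1)
        self.out = nn.Linear(2 * hidden_dim, num_relations)

    def forward(self, tokens, e1_idx, e2_idx):
        h, _ = self.encoder(self.emb(tokens))              # (B, T, 2H)
        batch = torch.arange(tokens.size(0))
        e_pair = h[batch, e1_idx] + h[batch, e2_idx]       # (B, 2H) entity-pair query
        query = e_pair.unsqueeze(1).expand_as(h)           # broadcast query to every token
        scores = self.attn(torch.cat([h, query], dim=-1)).squeeze(-1)   # (B, T)
        alpha = torch.softmax(scores, dim=-1)              # attention over tokens
        ctx = torch.bmm(alpha.unsqueeze(1), h).squeeze(1)  # weighted sentence vector (B, 2H)
        return self.out(ctx)                               # logits over relation labels

# Example usage with dummy data: 2 sentences of length 12,
# entities at token positions (1, 7) and (3, 9).
model = AttnRelationClassifier(vocab_size=5000, emb_dim=50, hidden_dim=64, num_relations=10)
logits = model(torch.randint(0, 5000, (2, 12)),
               torch.tensor([1, 3]), torch.tensor([7, 9]))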

Cited by 17 publications (10 citation statements)
References 17 publications

Citation statements (ordered by relevance):
“…Traditionally, researchers (Mintz et al. 2009; Riedel, Yao, and McCallum 2010; Hoffmann et al. 2011; Zeng et al. 2014; Shen and Huang 2016; Ren et al. 2017; Jat, Khandelwal, and Talukdar 2017; Vashishth et al. 2018; Ye and Ling 2019; Guo, Zhang, and Lu 2019; Nayak and Ng 2019) used a pipeline approach for relation tuple extraction, where relations were identified using a classification network after all entities were detected. Su et al. (2018) used an encoder-decoder model to extract multiple relations present between two given entities.…”
Section: Related Work (mentioning)
confidence: 99%
“…For instance, Vashishth et al. [10] and Hoffmann et al. [12] employed external components or knowledge-based methods to assist relation extraction. Considerably more researchers have utilized deep neural networks to achieve superior performance [8, 9, 11]. This has led to research into the modification and optimization of deep neural network-based models.…”
Section: Related Work (mentioning)
confidence: 99%
“…Due to the variety of traits, the extraction pipeline structures are just as diverse. Many researchers have chosen the pipeline approach [8][9][10][11][12] to detect entities and then extract relations between them. For instance, Vashishth et al. [10] and Hoffmann et al. [12] employed external components or knowledge-based methods to assist relation extraction.…”
Section: Related Work (mentioning)
confidence: 99%
“…This feature vector is passed to a feed-forward layer with softmax to determine the relation. Nayak and Ng (2019) used a dependency-distance-based multi-factor attention model for this task. Dependency distance helps to identify the important words in the sentence, and multi-factor attention helps to focus on multiple pieces of evidence for a relation.…”
Section: CNN-based Neural Models (mentioning)
confidence: 99%
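The last statement above pairs two ideas: a dependency-distance signal that highlights words close to the entities in the parse tree, and multi-factor attention that gathers several pieces of evidence at once. The sketch below is only a hedged illustration of how those two ingredients could be combined; the class name, dimensions, and the way the distance feature enters the attention scores are assumptions, not the authors' implementation.

# Hedged illustration (assumed design): several independent attention
# distributions ("factors") over the tokens, with a learned scalar bias
# derived from each token's dependency distance to the nearest entity.
import torch
import torch.nn as nn

class MultiFactorAttention(nn.Module):
    def __init__(self, hidden_dim, num_factors=4, max_dep_dist=10):
        super().__init__()
        self.max_dep_dist = max_dep_dist
        self.factors = nn.Linear(hidden_dim, num_factors)   # one attention score per factor per token
        self.dist_emb = nn.Embedding(max_dep_dist + 1, 1)   # scalar bias learned per dependency distance

    def forward(self, h, dep_dist):
        # h: (B, T, H) encoded tokens
        # dep_dist: (B, T) integer distance of each token to the nearest entity in the dependency tree
        dist_bias = self.dist_emb(dep_dist.clamp(max=self.max_dep_dist))   # (B, T, 1)
        scores = self.factors(h) + dist_bias                 # (B, T, F): distance-aware scores
        alpha = torch.softmax(scores, dim=1)                 # each factor attends over the tokens
        ctx = torch.einsum('btf,bth->bfh', alpha, h)         # one context vector per factor
        return ctx.flatten(1)                                # concatenated evidence, fed to a classifier

Each factor produces its own attention distribution, so different factors can lock onto different pieces of indirect evidence (for example, an intermediate third entity or a coreferent mention), while the distance bias pushes attention toward words that lie near the entities in the dependency tree.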