Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing, 2021
DOI: 10.18653/v1/2021.acl-long.344
Dependency-driven Relation Extraction with Attentive Graph Convolutional Networks

Abstract: Syntactic information, especially dependency trees, has been widely used by existing studies to improve relation extraction, providing better semantic guidance for analyzing the context associated with the given entities. However, most existing studies suffer from noise in the dependency trees, especially when they are automatically generated, so that intensively leveraging dependency information may introduce confusion into relation classification, and necessary pruning is of great importance in this …

Cited by 70 publications (18 citation statements). References 40 publications.
“…When graph neural networks are used for text processing, serialized text needs to be modeled first so as to introduce text structure information and then combined with downstream tasks to process text data using deep learning-based graph propagation algorithms. Graph neural networks use words as nodes and can introduce additional information to enrich features, such as dependency relationships [14], co-occurrence information between words [15], etc.…”
Section: Graphical Neural Network (mentioning, confidence: 99%)
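To make the word-node construction in the excerpt concrete, here is a minimal sketch (not code from the cited papers) of how a parsed sentence can be turned into a graph for a graph neural network: tokens become nodes and dependency arcs become edges in an adjacency matrix. The sentence and its arcs are hypothetical stand-ins for the output of a real dependency parser.

import numpy as np

# Hypothetical sentence and (head, dependent) dependency arcs, 0-indexed;
# a real pipeline would obtain these from an off-the-shelf parser.
tokens = ["The", "acquired", "company", "was", "founded", "in", "Seattle"]
arcs = [(2, 0), (2, 1), (4, 2), (4, 3), (4, 5), (5, 6)]

n = len(tokens)
adj = np.eye(n)                              # self-loops keep each word's own features
for head, dep in arcs:
    adj[head, dep] = adj[dep, head] = 1.0    # treat each arc as an undirected edge

print(adj)                                   # (7, 7) word-graph adjacency matrix

Additional signals such as dependency labels or word co-occurrence statistics can be attached to the same adjacency structure to enrich the node features, as the excerpt above notes.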
“…Ref. [27] proposed a dependency-driven approach for relation extraction with attentive graph convolutional networks (A-GCN). In this approach, an attention mechanism in graph convolutional networks is applied to different contextual words in the dependency tree obtained from an off-the-shelf dependency parser, to distinguish the importance of different word dependencies.…”
Section: Dependency Constraints Integration (mentioning, confidence: 99%)
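The excerpt describes the key idea of A-GCN: edge weights over the dependency graph are learned by attention rather than treated uniformly. Below is a minimal sketch, assuming PyTorch, of a generic attentive GCN layer in that spirit; it is an illustrative simplification, not the authors' exact A-GCN formulation, and the dimensions and toy adjacency are hypothetical.

import torch
import torch.nn as nn

class AttentiveGCNLayer(nn.Module):
    # One graph-convolution step whose edge weights come from attention
    # over connected word pairs instead of a fixed adjacency weighting.
    def __init__(self, dim):
        super().__init__()
        self.transform = nn.Linear(dim, dim)   # transforms neighbor features
        self.score = nn.Linear(2 * dim, 1)     # scores each (word_i, word_j) pair

    def forward(self, h, adj):
        # h: (n, dim) word representations; adj: (n, n) 0/1 dependency matrix with self-loops
        n = h.size(0)
        pairs = torch.cat([h.unsqueeze(1).expand(n, n, -1),
                           h.unsqueeze(0).expand(n, n, -1)], dim=-1)
        scores = self.score(pairs).squeeze(-1)                 # raw edge scores, (n, n)
        scores = scores.masked_fill(adj == 0, float("-inf"))   # keep only dependency edges
        attn = torch.softmax(scores, dim=-1)                   # normalize over each word's neighbors
        return torch.relu(attn @ self.transform(h))            # attention-weighted aggregation

# Toy usage: 7 words with 16-dimensional features and a small dependency graph.
h = torch.randn(7, 16)
adj = torch.eye(7)
for i, j in [(2, 0), (2, 1), (4, 2), (4, 3), (4, 5), (5, 6)]:
    adj[i, j] = adj[j, i] = 1.0
print(AttentiveGCNLayer(16)(h, adj).shape)   # torch.Size([7, 16])

Because low-scoring edges receive near-zero attention after the softmax, a layer of this kind performs a soft version of the pruning discussed in the abstract, down-weighting noisy dependency arcs instead of deleting them outright.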
“…Schlichtkrull et al. [12] applied the Relational Graph Convolutional Network (R-GCN) to two standard knowledge base completion tasks: link prediction and entity classification. Tian et al. [13] proposed a dependency-driven relation extraction method based on an Attentive Graph Convolutional Network (A-GCN). In the medical field, Sahu et al. [14] used a CNN to automatically learn features, and achieved an F1 of 71.6% on the I2B2-2010 clinical relation extraction challenge dataset.…”
Section: Related Work (mentioning, confidence: 99%)