Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers) 2016
DOI: 10.18653/v1/p16-1072

Bidirectional Recurrent Convolutional Neural Network for Relation Classification

Abstract: Relation classification is an important semantic processing task in the field of natural language processing (NLP). In this paper, we present a novel model, BRCNN, to classify the relation between two entities in a sentence. Some state-of-the-art systems concentrate on modeling the shortest dependency path (SDP) between two entities, leveraging convolutional or recurrent neural networks. We further explore how to make full use of the dependency relation information in the SDP by combining convolutional neural networ…
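
To make the architecture concrete, here is a minimal PyTorch sketch of the two-channel idea described in the abstract: one LSTM over the words on the SDP, a second LSTM over the dependency relations between them, and a convolution over each (word, relation, word) window of hidden states, applied to the path in both directions. All names and hyperparameters here are illustrative, and the paper's coarse- and fine-grained training objectives are collapsed into a single classifier for brevity.

```python
# Illustrative BRCNN-style encoder (PyTorch). Hyperparameters, class names,
# and the single softmax head are simplifications, not the paper's exact setup.
import torch
import torch.nn as nn

class TwoChannelSDPEncoder(nn.Module):
    """Encodes one direction of an SDP: a word channel and a relation channel."""
    def __init__(self, n_words, n_rels, emb_dim=200, hidden=100, n_filters=200):
        super().__init__()
        self.word_emb = nn.Embedding(n_words, emb_dim)
        self.rel_emb = nn.Embedding(n_rels, emb_dim)
        self.word_lstm = nn.LSTM(emb_dim, hidden, batch_first=True)
        self.rel_lstm = nn.LSTM(emb_dim, hidden, batch_first=True)
        # Convolution over [word_t ; relation_t ; word_{t+1}] windows,
        # i.e. one local feature per dependency edge on the path.
        self.conv = nn.Linear(3 * hidden, n_filters)

    def forward(self, words, rels):
        # words: (batch, n) word ids on the SDP; rels: (batch, n-1) relation ids
        hw, _ = self.word_lstm(self.word_emb(words))   # (batch, n, hidden)
        hr, _ = self.rel_lstm(self.rel_emb(rels))      # (batch, n-1, hidden)
        left, right = hw[:, :-1, :], hw[:, 1:, :]      # edge endpoint states
        local = torch.tanh(self.conv(torch.cat([left, hr, right], dim=-1)))
        return local.max(dim=1).values                 # max-pool along the path

class BRCNN(nn.Module):
    def __init__(self, n_words, n_rels, n_classes, n_filters=200):
        super().__init__()
        self.fwd = TwoChannelSDPEncoder(n_words, n_rels, n_filters=n_filters)
        self.bwd = TwoChannelSDPEncoder(n_words, n_rels, n_filters=n_filters)
        self.out = nn.Linear(2 * n_filters, n_classes)

    def forward(self, words, rels):
        # "Bidirectional": encode the path e1 -> e2 and its reverse e2 -> e1,
        # which helps distinguish directed relations such as
        # Cause-Effect(e1,e2) from Cause-Effect(e2,e1).
        f = self.fwd(words, rels)
        b = self.bwd(words.flip(1), rels.flip(1))
        return self.out(torch.cat([f, b], dim=-1))
```
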

Cited by 192 publications (123 citation statements); references 11 publications.
“…work focuses on sentence-level RE, i.e., extracting relational facts from a single sentence. In recent years, various neural models have been explored to encode relational patterns of entities for sentence-level RE, and achieve state-of-the-art performance (Socher et al., 2012; Zeng et al., 2014, 2015; dos Santos et al., 2015; Xiao and Liu, 2016; Cai et al., 2016; Lin et al., 2016; Wu et al., 2017; Qin et al., 2018; Han et al., 2018a).…”
Section: Introduction (mentioning)
confidence: 99%
“…
Model                            F1
SVM (Rink and Harabagiu, 2010)   82.2
CNN (Zeng et al., 2014)          82.7
SDP-LSTM (Xu et al., 2015)       83.7
Att-BLSTM                        84.0
BRCNN (Cai et al., 2016)         86.3
Ours                             84.1

Xu et al. (2015) achieved an F1-score of 83.7% via heterogeneous information along the SDP. BRCNN (Cai et al., 2016) combined a CNN with two-channel LSTM units to learn features along the SDP, and made use of POS tags, NER and WordNet hypernyms.…”
Section: Results (mentioning)
confidence: 99%
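
Several of the systems in this table start from the SDP itself. For a concrete reference point, here is a small sketch of how an SDP can be extracted with spaCy and networkx; the pipeline name and example sentence are placeholders, not taken from any of the cited papers.

```python
# Sketch: extract the shortest dependency path (SDP) between two tokens.
# Assumes the spaCy model "en_core_web_sm" is installed.
import spacy
import networkx as nx

nlp = spacy.load("en_core_web_sm")

def shortest_dep_path(doc, e1_idx, e2_idx):
    """Return the token indices on the SDP between tokens e1_idx and e2_idx."""
    g = nx.Graph()  # treat the dependency tree as an undirected graph
    for tok in doc:
        for child in tok.children:
            g.add_edge(tok.i, child.i, dep=child.dep_)
    return nx.shortest_path(g, source=e1_idx, target=e2_idx)

doc = nlp("The burst has been caused by water hammer pressure.")
# Token indices 1 ("burst") and 8 ("pressure") are hand-picked for this sentence.
print([doc[i].text for i in shortest_dep_path(doc, 1, 8)])
```
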
“…Santos et al. (2015) proposed a similar model named CR-CNN, and replaced the cost function with a ranking-based function. Some models (Xu et al., 2015; Cai et al., 2016) leveraged the shortest dependency path (SDP) between two nominals. Others employed attention mechanisms to capture more important semantic information.…”
Section: Related Work (mentioning)
confidence: 99%
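
For the ranking-based cost function mentioned above, here is a minimal sketch of the CR-CNN pairwise ranking loss, assuming the scaling and margin values reported by dos Santos et al. (2015) (gamma = 2, m+ = 2.5, m- = 0.5); their special treatment of the artificial "Other" class is simplified away.

```python
# Sketch of the CR-CNN pairwise ranking loss: push the gold-class score above
# one margin and the best competing class's score below another margin.
import torch

def ranking_loss(scores, gold, gamma=2.0, m_pos=2.5, m_neg=0.5):
    # scores: (batch, n_classes) class scores; gold: (batch,) gold labels
    s_pos = scores.gather(1, gold.unsqueeze(1)).squeeze(1)
    # Highest-scoring *incorrect* class for each example.
    masked = scores.scatter(1, gold.unsqueeze(1), float("-inf"))
    s_neg = masked.max(dim=1).values
    return (torch.log1p(torch.exp(gamma * (m_pos - s_pos)))
            + torch.log1p(torch.exp(gamma * (m_neg + s_neg)))).mean()
```
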
“…Variants of convolutional networks include the piecewise CNN (PCNN) (Zeng et al., 2015), split CNN (Adel et al., 2016), CNN with sentence-wise pooling (Jiang et al., 2016) and attention CNN. Recurrent neural networks (RNNs) are another popular choice, and have been used in recent work in the form of recurrent CNNs (Cai et al., 2016) and attention RNNs (Zhou et al., 2016). An instance-level selective attention mechanism was introduced for MIML by Lin et al. (2016), and has significantly improved the prediction accuracy for several of these base deep models.…”
Section: Related Work (mentioning)
confidence: 99%
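
To illustrate the piecewise-CNN variant, here is a sketch of piecewise max pooling: the convolutional feature map is split into three segments at the two entity positions and each segment is max-pooled separately, so the pooled vector keeps some positional structure that plain max pooling discards. Shapes and names are illustrative; the snippet assumes 0 <= e1_pos < e2_pos < seq_len - 1 so that no segment is empty.

```python
# Sketch of piecewise max pooling as used in PCNN-style models.
import torch

def piecewise_max_pool(conv_out, e1_pos, e2_pos):
    # conv_out: (n_filters, seq_len) feature map for one sentence
    segments = (conv_out[:, :e1_pos + 1],            # up to entity 1
                conv_out[:, e1_pos + 1:e2_pos + 1],  # between the entities
                conv_out[:, e2_pos + 1:])            # after entity 2
    pooled = [seg.max(dim=1).values for seg in segments]
    return torch.tanh(torch.cat(pooled))             # (3 * n_filters,)
```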