Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing
DOI: 10.18653/v1/d15-1206

Classifying Relations via Long Short Term Memory Networks along Shortest Dependency Paths

Abstract: Relation classification is an important research arena in the field of natural language processing (NLP). In this paper, we present SDP-LSTM, a novel neural network to classify the relation of two entities in a sentence. Our neural architecture leverages the shortest dependency path (SDP) between two entities; multichannel recurrent neural networks, with long short term memory (LSTM) units, pick up heterogeneous information along the SDP. Our proposed model has several distinct features: (1) The shortest depen…
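To make the architecture described in the abstract concrete, here is a minimal sketch of the preprocessing step it relies on: recovering the shortest dependency path (SDP) between the two entity tokens from a dependency parse. The toy sentence, the hand-assigned head indices, and the helper name shortest_dependency_path are illustrative assumptions rather than the authors' code; the full SDP-LSTM model then runs multichannel LSTMs (words plus heterogeneous features such as POS tags and grammatical relations) along this path.

```python
# Illustrative sketch (not the authors' code): recover the shortest dependency
# path (SDP) between two entity tokens from a dependency parse, i.e. the
# structure that the SDP-LSTM channels are run over.
import networkx as nx

def shortest_dependency_path(tokens, heads, e1, e2):
    """tokens: list of words; heads: head index per token (-1 for the root);
    e1, e2: token indices of the two entities."""
    graph = nx.Graph()
    for i, head in enumerate(heads):
        if head >= 0:
            graph.add_edge(i, head)          # undirected edge: child <-> head
    path = nx.shortest_path(graph, source=e1, target=e2)
    return [tokens[i] for i in path]

# Toy parse of "gallons of water poured into region"
# (head indices are assumed for illustration only).
tokens = ["gallons", "of", "water", "poured", "into", "region"]
heads  = [3, 0, 1, -1, 3, 4]
print(shortest_dependency_path(tokens, heads, e1=2, e2=5))
# -> ['water', 'of', 'gallons', 'poured', 'into', 'region']
```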

Cited by 561 publications (399 citation statements)
References 22 publications
“…Model                          | F1
SVM (Rink and Harabagiu, 2010)   | 82.2
CNN (Zeng et al., 2014)          | 82.7
SDP-LSTM (Xu et al., 2015)       | 83.7
Att-BLSTM                        | 84.0
BRCNN (Cai et al., 2016)         | 86.3
Ours                             | 84.1
Xu et al. (2015) achieved an F1-score of 83.7% via heterogeneous information along the SDP. BRCNN (Cai et al., 2016) combined CNN and two-channel LSTM units to learn features along the SDP, and made use of POS tags, NER and WordNet hypernyms.…”
Section: Results (mentioning)
confidence: 99%
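For readers unfamiliar with the models compared in this quote, the following is a rough sketch of a single word channel of an LSTM-over-SDP classifier in the spirit of SDP-LSTM. The vocabulary size, embedding and hidden dimensions, and the 19 relation classes of SemEval-2010 Task 8 are assumed placeholders; real systems add further channels (POS tags, grammatical relations, hypernyms) and, in BRCNN's case, a CNN over adjacent path words.

```python
# Minimal sketch (assumed hyperparameters) of one word channel of an
# LSTM-over-SDP relation classifier, in the spirit of SDP-LSTM / BRCNN.
import torch
import torch.nn as nn

class SDPChannel(nn.Module):
    def __init__(self, vocab_size=20000, emb_dim=200, hidden_dim=100, n_classes=19):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, n_classes)

    def forward(self, path_ids):                      # (batch, path_len) word ids along the SDP
        hidden, _ = self.lstm(self.embed(path_ids))   # (batch, path_len, hidden_dim)
        pooled, _ = hidden.max(dim=1)                 # max-pool over path positions
        return self.out(pooled)                       # unnormalized relation scores

scores = SDPChannel()(torch.randint(0, 20000, (4, 6)))  # 4 paths of length 6
print(scores.shape)                                     # torch.Size([4, 19])
```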
“…Santos et al (2015) proposed a similar model named CR-CNN, and replaced the cost function with a rankingbased function. Some models (Xu et al, 2015;Cai et al, 2016) leveraged the shortest dependency path(SDP) between two nominals. Others employed attention mechanism to capture more important semantic information.…”
Section: Related Workmentioning
confidence: 99%
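The attention mechanism mentioned at the end of this excerpt (as in Att-BLSTM-style models) typically amounts to a learned weighting over the recurrent hidden states instead of max pooling. The sketch below is a generic illustration with assumed dimensions, not a specific author's implementation.

```python
# Rough sketch (assumed shapes) of attention pooling over BLSTM hidden states,
# as used by attention-based relation classifiers in place of max pooling.
import torch
import torch.nn as nn

class AttentionPool(nn.Module):
    def __init__(self, hidden_dim=200):
        super().__init__()
        self.query = nn.Linear(hidden_dim, 1, bias=False)  # scores each time step

    def forward(self, hidden):                              # hidden: (batch, seq_len, hidden_dim)
        weights = torch.softmax(self.query(hidden).squeeze(-1), dim=1)   # (batch, seq_len)
        return (weights.unsqueeze(-1) * hidden).sum(dim=1)  # weighted sum -> (batch, hidden_dim)

pooled = AttentionPool()(torch.randn(4, 12, 200))
print(pooled.shape)                                          # torch.Size([4, 200])
```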
“…Recently, several neural network architectures (Chiu and Nichols, 2015; Lample et al., 2016) have been successfully applied to NER, which is regarded as a sequential token tagging task. Existing methods for relation classification can also be divided into handcrafted feature-based methods (Rink, 2010; Kambhatla, 2004) and neural network-based methods (Xu, 2015a; Zheng et al., 2016; Zeng, 2014; Xu, 2015b; dos Santos, 2015).…”
Section: Related Work (mentioning)
confidence: 99%
“…Many systems from the literature make use of dependency features (Nguyen and Grishman, 2015a; Xu et al., 2016, 2015), which, intuitively, should be useful for re…

Table 1: Class distribution in subtask 1 training datasets for scenario 1.1 (clean annotation) and scenario 1.2 (noisy annotation). Last column shows the combined counts of both datasets.

Relation       | Scenario 1.1     | Scenario 1.2     | Total
USAGE          | 296 / 187 / 483  | 323 / 147 / 470  | 953
TOPIC          | 8 / 10 / 18      | 230 / 13 / 243   | 261
COMPARE        | 95 / - / 95      | 41 / - / 41      | 136
MODEL-FEATURE  | 226 / 100 / 326  | 123 / 52 / 175   | 501
RESULT         | 52 / 20 / 72     | 85 / 38 / 123    | 195
PART-WHOLE     | 158 / 76 / 234   | 117 / 79 / 196   | 430
Total          | 835 / 393 / 1228 | 919 / 329 / 1248 | 2476
…”
Section: SVM Model (mentioning)
confidence: 99%