2018
DOI: 10.1016/j.procs.2018.04.221
A Combination of RNN and CNN for Attention-based Relation Classification

Cited by 71 publications (24 citation statements)
References 4 publications
“…They tested their model on the dataset and achieved 84.39%. Zhang et al. [27] proposed an RCNN model that combines RNN and CNN in the network structure. This model achieved an F1-score of 83.7%.…”
Section: Experiment Results (mentioning)
confidence: 99%
“…A bidirectional LSTM performs better than a standard LSTM. Zhang et al. [27] proposed an RCNN (Recurrent Convolutional Neural Network) model that combines the advantages of RNNs and CNNs: the RNN addresses long-range dependencies, while the CNN extracts richer local features.…”
Section: Related Work (mentioning)
confidence: 99%
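The excerpt above describes the general RCNN pattern: an RNN encodes long-range context, a CNN then extracts local features from the RNN's hidden states, and attention pools them into a sentence vector. The following is a minimal NumPy sketch of that pattern only, with toy dimensions and random weights; it is not the authors' implementation, and every function and shape here is illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def simple_rnn(x, W_h, W_x):
    """Run a minimal tanh RNN over a sequence, returning all hidden states."""
    h = np.zeros(W_h.shape[0])
    states = []
    for x_t in x:
        h = np.tanh(W_h @ h + W_x @ x_t)
        states.append(h)
    return np.stack(states)                      # (T, hidden)

def conv1d(states, filters):
    """Slide width-3 filters over the RNN states to extract local features."""
    T, _ = states.shape
    n_filters, _, width = filters.shape          # (n_filters, hidden, width)
    out = np.empty((T - width + 1, n_filters))
    for t in range(T - width + 1):
        window = states[t:t + width].T           # (hidden, width)
        out[t] = np.tensordot(filters, window, axes=2)
    return np.maximum(out, 0)                    # ReLU

def attention_pool(features, query):
    """Softmax-weight the convolutional features and sum them."""
    scores = features @ query
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights @ features                    # (n_filters,)

# Toy input: a sequence of 7 word vectors of size 5 (hypothetical values).
x = rng.normal(size=(7, 5))
W_h = rng.normal(size=(8, 8)) * 0.1
W_x = rng.normal(size=(8, 5)) * 0.1
filters = rng.normal(size=(6, 8, 3)) * 0.1
query = rng.normal(size=6)

states = simple_rnn(x, W_h, W_x)       # RNN: long-range context
features = conv1d(states, filters)     # CNN: local n-gram features
sentence_vec = attention_pool(features, query)
print(sentence_vec.shape)              # (6,)
```

In a real relation classifier, the pooled vector would feed a softmax layer over relation labels, and the RNN would typically be a bidirectional LSTM rather than the plain tanh recurrence used here for brevity.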
“…The proposed joint extraction model is applied to Chinese datasets to verify its validity. BRNN [28], SDP-BLSTM [29], CNN [30], Att-RCNN [31], and Hybrid Bi-LSTM-Siamese [32] are also run as baselines for the scalability evaluation.…”
Section: Experimental Design (mentioning)
confidence: 99%
“…Recently, Deep Learning has strongly influenced semantic relation learning. Word embeddings can provide attributional features for a variety of learning frameworks (Attia et al., 2016; Vylomova et al., 2016), and the sentential context (in its entirety, or only the structured, via grammatical relations, or unstructured phrase expressing the relation) can be modeled through a variety of neural architectures: CNNs (Tan et al., 2018; Ren et al., 2018) or RNN variations (Zhang et al., 2018). Speer et al. (2008) introduce AnalogySpace, a representation of concepts and relations in CONCEPTNET built by factorizing a matrix with concepts on one axis and their features or properties (according to CONCEPTNET) on the other.…”
Section: Semantic Relation Classification (mentioning)
confidence: 99%