2022
DOI: 10.3390/app122111068
CRSAtt: By Capturing Relational Span and Using Attention for Relation Classification

Abstract: Relation classification is an important fundamental task in information extraction, and convolutional neural networks have commonly been applied to relation classification with good results. In recent years, with the introduction of the pre-trained model BERT, whose use as a feature extraction architecture has become increasingly popular, convolutional neural networks have gradually receded from the NLP stage, and relation classification/extraction models based on pre-trained BERT have achieved state-o…

Cited by 3 publications (3 citation statements) · References 25 publications
“…Compared with the BiLSTM-attention and multi-attention CNN, which are based on a bidirectional long short-term memory network and an attention mechanism, it increased by 3.7% and 2.7%, respectively. Compared with CRSAtt [37] and FA-RCNet [38], which are based on the BERT pre-trained model, it increased by 0.8% and 0.9%, respectively. The experimental results showed that the relation classification model proposed in this paper also achieved good results on the SemEval-2010 Task 8 dataset.…”
Section: Comparative Experiments Results and Analysis
confidence: 98%
“…Training the model on the enhanced dataset can obtain better model weights. CRSAtt [44] proposes a BERT-based relation classification model. CRSAtt processes the BERT output features by fusing sentence and entity features.…”
Section: Methods
confidence: 99%
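The fusion idea mentioned above — combining a sentence-level representation with entity-level representations taken from BERT's token outputs — can be sketched as follows. This is a hedged illustration, not CRSAtt's actual method: the span-pooling choice (mean over entity tokens), the [CLS] sentence vector, and the random untrained projection `W` are all assumptions for demonstration.

```python
import numpy as np

def fuse_features(token_embs, e1_span, e2_span, rng=None):
    """Fuse sentence- and entity-level features from BERT-style token
    embeddings. A generic sketch; the paper's exact fusion may differ.

    token_embs: (seq_len, hidden) array of contextual token embeddings.
    e1_span, e2_span: (start, end) token index ranges of the two entities.
    """
    hidden = token_embs.shape[1]
    sent = token_embs[0]                                 # [CLS]-style sentence vector
    e1 = token_embs[e1_span[0]:e1_span[1]].mean(axis=0)  # mean-pooled entity 1
    e2 = token_embs[e2_span[0]:e2_span[1]].mean(axis=0)  # mean-pooled entity 2
    fused = np.concatenate([sent, e1, e2])               # (3 * hidden,)
    # Project back to hidden size with a random, untrained linear map
    # (a stand-in for a learned layer).
    if rng is None:
        rng = np.random.default_rng(0)
    W = rng.standard_normal((3 * hidden, hidden)) / np.sqrt(3 * hidden)
    return np.tanh(fused @ W)                            # (hidden,)
```

In a real model the projection would be a trained linear layer and the token embeddings would come from a BERT encoder; the structure (concatenate, then project) is the common pattern for entity-aware relation classifiers.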
“…In computer vision tasks, the visual attention mechanism [45,46] helps the model attend to the key information in an image. In natural language processing tasks, the attention mechanism [44,47,48] can analyze the relationship between each word in the text and words of different parts of speech. The purpose of this is to obtain higher-quality semantic features.…”
Section: Methods
confidence: 99%
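The word-to-word relationship analysis described in the statement above is what scaled dot-product attention computes: each word's query is compared against every word's key, and the resulting softmax weights mix the value vectors. A minimal NumPy sketch (not any specific paper's variant):

```python
import numpy as np

def scaled_dot_attention(Q, K, V):
    """Scaled dot-product attention: each query row attends to every
    key row; the softmax weights then mix the value rows.

    Q, K, V: (n_words, d) arrays of query/key/value vectors.
    Returns (output, weights) with shapes (n_words, d) and (n_words, n_words).
    """
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                    # pairwise word affinities
    scores -= scores.max(axis=-1, keepdims=True)     # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ V, weights
```

Each row of `weights` sums to 1, so the output for a word is a convex combination of all words' value vectors, weighted by semantic affinity.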