2020
DOI: 10.4018/ijswis.2020040101

Distant Supervised Relation Extraction via DiSAN-2CNN on a Feature Level

Abstract: Current mainstream distantly supervised relation extraction methods suffer from several problems: coarse-grained encoding of contextual feature information, difficulty capturing long-term dependencies within a sentence, and difficulty encoding prior structural knowledge. To address these problems, we propose a distantly supervised relation extraction model based on DiSAN-2CNN at the feature level, in which a multi-dimensional self-attention mechanism is used to encode the features of the…
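The truncated abstract describes the core idea: a multi-dimensional (feature-level) self-attention encoder in the style of DiSAN, whose output feeds a CNN. As a rough illustration only, the PyTorch sketch below shows what vector-valued (per-feature-dimension) self-attention scores look like; the class and variable names are illustrative assumptions, not the authors' code.

```python
# Minimal sketch (assumption, not the published DiSAN-2CNN code) of
# multi-dimensional self-attention: alignment scores are vectors, so each
# feature dimension of a token gets its own attention weight.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiDimSelfAttention(nn.Module):
    def __init__(self, d_model: int):
        super().__init__()
        self.w_q = nn.Linear(d_model, d_model)
        self.w_k = nn.Linear(d_model, d_model)
        self.bias = nn.Parameter(torch.zeros(d_model))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model)
        q = self.w_q(x).unsqueeze(2)              # (batch, seq, 1, d)
        k = self.w_k(x).unsqueeze(1)              # (batch, 1, seq, d)
        # Vector-valued alignment: one score per feature dimension.
        scores = torch.tanh(q + k + self.bias)    # (batch, seq, seq, d)
        weights = F.softmax(scores, dim=2)        # normalise over key positions
        # Weighted sum of token vectors, computed per feature dimension.
        out = (weights * x.unsqueeze(1)).sum(dim=2)   # (batch, seq, d)
        return out


if __name__ == "__main__":
    enc = MultiDimSelfAttention(d_model=64)
    tokens = torch.randn(2, 30, 64)   # 2 sentences, 30 tokens, 64-dim embeddings
    print(enc(tokens).shape)          # torch.Size([2, 30, 64])
```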

Cited by 8 publications (7 citation statements). References: 14 publications.
“…Certain watermark data is used while training the model which helps the system to effectively detect and extract the watermark signal. Lv et al [13] present a novel application of CNN for Relation extraction which is one of the most important tasks in Natural Language Processing (NLP). Authors have utilized both CNN and directional self-attention network (DiSAN) model where pooling layers of CNN are modified to reduce the dimensions of the feature vector.…”
Section: Introduction (mentioning; confidence: 99%)
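The modified-pooling detail mentioned in the citation above can be pictured as a 1-D convolution followed by a pooling step that collapses the token dimension, so the relation classifier receives a compact, fixed-size feature vector. The sketch below is an assumption-laden illustration of that stage (not the published DiSAN-2CNN code) and reuses the encoder output shape from the previous sketch.

```python
# Minimal sketch (assumption) of the CNN stage: 1-D convolution over the
# attention-encoded tokens, then max-pooling over the sequence so only a
# fixed-size vector reaches the relation classifier.
import torch
import torch.nn as nn


class CNNRelationHead(nn.Module):
    def __init__(self, d_model: int, n_filters: int, n_relations: int, kernel_size: int = 3):
        super().__init__()
        self.conv = nn.Conv1d(d_model, n_filters, kernel_size, padding=kernel_size // 2)
        self.classifier = nn.Linear(n_filters, n_relations)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model); Conv1d expects (batch, channels, seq_len)
        feats = torch.relu(self.conv(x.transpose(1, 2)))   # (batch, n_filters, seq_len)
        pooled = feats.max(dim=2).values                    # (batch, n_filters): pool away the sequence
        return self.classifier(pooled)                      # (batch, n_relations) relation logits


if __name__ == "__main__":
    head = CNNRelationHead(d_model=64, n_filters=128, n_relations=53)
    encoded = torch.randn(2, 30, 64)   # e.g. output of the self-attention sketch above
    print(head(encoded).shape)         # torch.Size([2, 53])
```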
“…There is room for improvement in the overall accuracy of the prediction model, particularly in terms of increasing the time of in-advance prediction. Suggested future research directions include (i) increasing the number of data points used through data generation and augmentation techniques [58,59]; (ii) incorporating deep learning to extract features from the input data [60,61]; (iii) investigating the enhancement of algorithms through boosting techniques such as multiple RNNs, multiple GRUs, and multiple LSTMs [62,63]; and (iv) investigating the transition between classes.…”
Section: Discussion (mentioning; confidence: 99%)
“…Srinivasan and L D (2019) proposed a twofold convolutional neural network approach with a new activation function, which generalizes faster with higher accuracy. Lv et al (2020) proposed a distant supervised relation extraction model, in which a multi-dimension self-attention mechanism is utilized to encode the features of the words together with the entire sentence. Siriwon Taewijit & Thanaruk Theeramunkong (2021) found that DL can be very helpful for diagnosing neurological disorders by its symptoms, because DL can be used to identify patterns for each disorder for identification.…”
Section: Pipeline Methods (mentioning; confidence: 99%)
“…Most joint entity-relationship extractions are still supervised learning approaches that require a large amount of high-quality training set (Lv et al, 2020). Recent research tried to deal with the problem of having a relatively small training sample size.…”
Section: Joint Methods (mentioning; confidence: 99%)