2019
DOI: 10.1109/access.2019.2943545

Multi-Channel CNN Based Inner-Attention for Compound Sentence Relation Classification

Abstract: Relation classification is a vital task in natural language processing: it identifies the semantic relation between clauses in texts. This paper presents a study of relation classification on Chinese compound sentences without connectives. Such a sentence carries an implicit relation, which makes recognizing the relation difficult. The major challenge for relation classification modeling is how to obtain the contextual representation of a sentence …


Cited by 14 publications (6 citation statements)
References 23 publications
“…Generally, in a CNN model, the inputs are connected to convolution and max-pooling layers, followed by a couple of fully connected layers that feed the output layer. But in our case, the preprocessed inputs are fed to a multi-channel CNN model, which has proven very effective in various text classification tasks [27, 28]. The motivation behind this approach is to ensure that a sequence is processed at several window lengths at a time.…”
Section: Methodsmentioning
confidence: 99%
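The multi-channel idea quoted above — running convolution filters of several widths over the same token sequence, max-pooling each channel, and concatenating the results — can be sketched in a few lines. This is an illustrative NumPy sketch under assumed settings (filter widths 2/3/4, four filters per channel, random weights and embeddings), not the paper's actual implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy embedded sentence: 10 tokens, 8-dimensional embeddings (made-up sizes).
seq_len, emb_dim = 10, 8
sentence = rng.standard_normal((seq_len, emb_dim))

def conv1d_max(sent, kernel_width, num_filters, rng):
    """One CNN channel: slide filters of one width over the token
    sequence (valid convolution + ReLU), then max-pool over time."""
    W = rng.standard_normal((num_filters, kernel_width, sent.shape[1]))
    # All sliding windows of the given width: (num_windows, width, emb_dim)
    windows = np.stack([sent[i:i + kernel_width]
                        for i in range(sent.shape[0] - kernel_width + 1)])
    # Feature map (num_windows, num_filters) with ReLU activation
    fmap = np.maximum(0.0, np.einsum('nwd,fwd->nf', windows, W))
    return fmap.max(axis=0)  # max over time -> (num_filters,)

# Each kernel width captures n-gram features at a different granularity;
# the concatenated vector would feed a fully connected classifier (omitted).
features = np.concatenate([conv1d_max(sentence, k, 4, rng)
                           for k in (2, 3, 4)])
print(features.shape)  # (12,)
```

In practice each channel would be a learned `Conv1d` layer in a framework such as PyTorch; the sketch only shows why different kernel widths let the same sequence be "processed at different lengths at a time."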
“…CNNs are typical feedforward neural networks with convolutional computations and deep structures. As some of the most representative deep learning models, CNNs have been widely applied in many fields and numerous related applications, including image classification [24, 25, 26, 27, 28], natural language processing [29, 30], face recognition [31, 32], video analysis [33, 34], and pedestrian detection [35, 36].…”
Section: Methodsmentioning
confidence: 99%
“…Following previous work [6], [8], [9], [18] on anaphora resolution, the metrics employed to evaluate our model are precision, recall, and F-score (F). We report the performance for each hyperparameter setting as well as the overall result.…”
Section: B Evaluation Measuresmentioning
confidence: 99%
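The evaluation metrics named in the statement above have standard definitions. A minimal sketch, computing them from true-positive, false-positive, and false-negative counts (the counts here are made-up illustration, not results from any cited paper):

```python
def precision_recall_f(tp, fp, fn):
    """Standard precision, recall, and F-score (harmonic mean of the two)."""
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f = (2 * precision * recall / (precision + recall)
         if precision + recall else 0.0)
    return precision, recall, f

# Example: 8 correct resolutions, 2 spurious, 4 missed.
p, r, f = precision_recall_f(tp=8, fp=2, fn=4)
print(round(p, 3), round(r, 3), round(f, 3))  # 0.8 0.667 0.727
```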
“…Anaphora resolution is an important sub-task in natural language processing. In recent years, deep learning models for anaphora resolution have been widely investigated [6]- [11]. These methods concentrate on anaphoric pronoun resolution, applying numerous neural network models to pronoun-candidate antecedent prediction.…”
Section: Introductionmentioning
confidence: 99%