2020 IEEE 14th International Conference on Semantic Computing (ICSC) 2020
DOI: 10.1109/icsc.2020.00024
SeMemNN: A Semantic Matrix-Based Memory Neural Network for Text Classification

Abstract: Text categorization is the task of assigning labels to documents written in a natural language; it has numerous real-world applications, including sentiment analysis as well as traditional topic-assignment tasks. In this paper, we propose five different configurations for a semantic matrix-based memory neural network trained in an end-to-end manner, and we evaluate the proposed method on two corpora of news articles (AG News, Sogou News). The best of our proposed configurations outperforms the baseline VDCNN…

Cited by 5 publications (3 citation statements) · References 17 publications
“…Text: For text emotion classification, we employed SeMemNN [55] and trained it from scratch. Fu et al. [55] confirmed that SeMemNN learns semantics well and trains quickly on small datasets.…”
Section: Proposed Methods
confidence: 99%
“…Text: For text emotion classification, we employed SeMemNN [55] and trained it from scratch. Fu et al. [55] confirmed that SeMemNN learns semantics well and trains quickly on small datasets. In SeMemNN, two inputs work together to construct the addressing and semantic matrices, which yield an address vector used to read the corresponding information from the semantic matrix.…”
Section: Proposed Methods
confidence: 99%
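The addressing mechanism quoted above can be illustrated with a minimal sketch: an address vector (a softmax attention distribution over memory slots) is computed against an addressing matrix and then used to read a weighted combination of rows from the semantic matrix. All names and shapes below are hypothetical illustrations of this general memory-read pattern, not the authors' exact architecture or API.

```python
import numpy as np

def memory_read(query, addressing_matrix, semantic_matrix):
    """Sketch of an attention-style memory read.

    The addressing matrix scores the query against each memory slot;
    a softmax over those scores gives the address vector, which then
    reads a weighted mix of rows from the semantic matrix.
    """
    scores = addressing_matrix @ query               # one score per slot, shape (slots,)
    scores = scores - scores.max()                   # shift for numerical stability
    address = np.exp(scores) / np.exp(scores).sum()  # softmax address vector
    return address @ semantic_matrix                 # weighted read, shape (dim,)

rng = np.random.default_rng(0)
M_addr = rng.normal(size=(8, 16))  # 8 memory slots with 16-dim addressing keys
M_sem = rng.normal(size=(8, 32))   # matching 32-dim semantic contents
q = rng.normal(size=16)            # encoded input query

out = memory_read(q, M_addr, M_sem)
print(out.shape)  # (32,)
```

In end-to-end training, both matrices would be produced by learned encodings of the two inputs rather than sampled at random, so the gradient flows through the softmax addressing into the encoders.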
“…We simply ran text classification based on SeMemNN [6] and BERT [7]. The training and test sets were split as predefined.…”
Section: Text Classification
confidence: 99%