2016
DOI: 10.14257/ijgdc.2016.9.11.06

A Sense Embedding of Deep Convolutional Neural Networks for Sentiment Classification


Cited by 3 publications (3 citation statements)
References 10 publications
“…Distance-based methods: DTW [4]; feature-based methods: TSBF [6], BOSS [7]; ensemble-based methods: PROP [12], COTE [13]; neural network methods: FCN [17], ResNet [22], OS-CNN [26]…”
Section: Experiments Results and Analysis
confidence: 99%
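For context on the distance-based baseline named first in this excerpt, here is a minimal NumPy sketch of classic dynamic time warping (DTW). The function name and toy series are illustrative only, not code from the cited work:

```python
import numpy as np

def dtw_distance(x: np.ndarray, y: np.ndarray) -> float:
    """Classic O(n*m) dynamic-time-warping distance between two 1-D series."""
    n, m = len(x), len(y)
    # cost[i, j] = minimal cumulative cost of aligning x[:i] with y[:j]
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(x[i - 1] - y[j - 1])              # local distance
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return float(cost[n, m])

# DTW with a 1-nearest-neighbor classifier is the usual "distance-based" baseline.
x = np.sin(np.linspace(0, 2 * np.pi, 50))
y = np.sin(np.linspace(0, 2 * np.pi, 60))  # same shape, different length
print(dtw_distance(x, y))  # small: warping absorbs the length mismatch
```

The elastic alignment is what lets DTW compare series of different lengths or phase, which Euclidean distance cannot do.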
“…The fully convolutional network (FCN) [15] uses convolutional layers to replace the last fully connected layer of the deep multilayer perceptron (MLP) [16], and adds batch normalization and global pooling layers to prevent the network from overfitting, which enhances its feature-extraction ability. Building on convolutional neural networks, the multi-scale convolutional neural network (MCNN) [17] applies identity mapping, smoothing, and downsampling to extract multi-scale features from time series data, addressing the problem of feature loss. However, its classification performance depends largely on the choice of hyperparameters and the quality of data preprocessing [18], [19].…”
Section: Introduction
confidence: 99%
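The excerpt describes the FCN recipe at a high level. The sketch below is one common PyTorch rendering of it: convolutional blocks with batch normalization, then global average pooling instead of a large fully connected head. The kernel sizes and channel widths (8/5/3, 128/256/128) are conventional choices assumed here, not values taken from the cited papers:

```python
import torch
import torch.nn as nn

class FCN(nn.Module):
    """Fully convolutional network for time series classification."""
    def __init__(self, in_channels: int, n_classes: int):
        super().__init__()
        def block(c_in, c_out, k):
            return nn.Sequential(
                nn.Conv1d(c_in, c_out, kernel_size=k, padding="same"),
                nn.BatchNorm1d(c_out),  # the batch-normalization layer the excerpt mentions
                nn.ReLU(),
            )
        self.features = nn.Sequential(
            block(in_channels, 128, 8),
            block(128, 256, 5),
            block(256, 128, 3),
        )
        self.pool = nn.AdaptiveAvgPool1d(1)  # global pooling replaces the MLP's dense layers
        self.classifier = nn.Linear(128, n_classes)

    def forward(self, x):                    # x: (batch, channels, time)
        z = self.pool(self.features(x)).squeeze(-1)
        return self.classifier(z)

model = FCN(in_channels=1, n_classes=5)
out = model(torch.randn(4, 1, 140))          # e.g. 4 univariate series of length 140
print(out.shape)                             # torch.Size([4, 5])
```

Because global average pooling collapses the time axis to a single vector, the same network handles series of any length and carries far fewer parameters than a dense head, which is the overfitting protection the excerpt points to.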
“…Long short-term memory networks and convolutional neural networks have shown promise in modeling time series data [23], [27-30]. To exploit the favorable properties of both LSTMs and convolutional neural networks for time series, we used Conv-LSTM layers, conceptualized in 2015 by Shi et al [26] and successfully used by Rahman and Adjeroh [23]. This approach allows for a reduction in the number of LSTM time steps from 10 080 (ie, 1 time step for every minute) to 7 (ie, 1 time step for every day of the week).…”
Section: Convolutional-Long Short-Term Memory Model with Time Series ...
confidence: 99%
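The arithmetic behind the step reduction is 10 080 = 7 days × 1 440 minutes per day. The excerpt's model uses Conv-LSTM layers (Shi et al); since PyTorch ships no built-in Conv-LSTM, the sketch below approximates the same idea with a shared per-day convolutional encoder feeding a standard LSTM over 7 daily steps. It is not the authors' exact model, and all layer sizes are illustrative assumptions:

```python
import torch
import torch.nn as nn

MIN_PER_DAY = 1440  # 7 * 1440 = 10 080 minutes per week, as in the excerpt

class WeeklyConvLSTM(nn.Module):
    """A shared conv encoder turns each day's 1440 minute-level readings into
    one embedding, so the LSTM unrolls over 7 daily steps, not 10 080."""
    def __init__(self, in_channels: int, n_classes: int, hidden: int = 64):
        super().__init__()
        self.day_encoder = nn.Sequential(
            nn.Conv1d(in_channels, 32, kernel_size=9, stride=4), nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=9, stride=4), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),           # one 64-d vector per day
        )
        self.lstm = nn.LSTM(64, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):                      # x: (batch, channels, 10080)
        b, c, t = x.shape
        days = t // MIN_PER_DAY                # 7 steps, one per day
        x = x.view(b, c, days, MIN_PER_DAY)    # split the week into days
        x = x.permute(0, 2, 1, 3).reshape(b * days, c, MIN_PER_DAY)
        emb = self.day_encoder(x).squeeze(-1).view(b, days, 64)
        out, _ = self.lstm(emb)                # LSTM over 7 daily embeddings
        return self.head(out[:, -1])           # classify from the last step

model = WeeklyConvLSTM(in_channels=1, n_classes=2)
print(model(torch.randn(2, 1, 7 * MIN_PER_DAY)).shape)  # torch.Size([2, 2])
```

The design trade-off is the one the excerpt names: the convolution handles dense within-day structure cheaply, while the recurrence only has to model the much shorter day-to-day sequence.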