2021
DOI: 10.1108/ijicc-06-2021-0109
High accuracy offering attention mechanisms based deep learning approach using CNN/bi-LSTM for sentiment analysis

Abstract: Purpose – A neural network (NN)-based deep learning (DL) approach is considered for sentiment analysis (SA), incorporating a convolutional neural network (CNN), bi-directional long short-term memory (Bi-LSTM) and attention mechanisms. Unlike conventional supervised machine-learning natural language processing algorithms, the authors use unsupervised deep learning algorithms. Design/methodology/approach – The method presented for sentiment analysis is designed using CNN, Bi-LSTM and the attention mechanism. Word2…
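The abstract combines CNN/Bi-LSTM features with an attention mechanism. The paper's exact attention variant is not shown in this excerpt, but the general idea — scoring each Bi-LSTM hidden state and pooling the sequence into one weighted context vector — can be sketched as follows. All names, dimensions and the dot-product scoring are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_pool(hidden, w):
    """Pool a sequence of hidden states into one context vector.

    hidden: (seq_len, dim) array of per-token states (e.g. from a Bi-LSTM).
    w:      (dim,) learned scoring vector (random here, for illustration).
    Returns the attention-weighted sum of the states, shape (dim,).
    """
    scores = hidden @ w      # (seq_len,) relevance score per token
    alpha = softmax(scores)  # attention weights, non-negative, sum to 1
    return alpha @ hidden    # weighted combination of hidden states

rng = np.random.default_rng(0)
h = rng.normal(size=(5, 8))                       # 5 tokens, 8-dim states
context = attention_pool(h, rng.normal(size=8))
print(context.shape)                              # (8,)
```

The pooled `context` vector would then feed a classifier layer that outputs the sentiment label.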

Cited by 16 publications (16 citation statements); references 30 publications.
“…Especially in situations with multiple classifications, this may increase computational complexity and storage-space requirements. At present, mainstream word-vector construction methods include the context-based pre-trained word-vector method (Word2vec), Global Vectors for Word Representation (GloVe) and the Transformer-based pre-training model (BERT) 21 , 22 . In Word2vec training, the Continuous Bag-of-Words (CBOW) model, which predicts the middle word in a sliding window, and the Skip-Gram model, which predicts the words on both sides of a known middle word, are shown in Fig.…”
Section: Design Of Intelligent English Composition Scoring Methods In...mentioning
confidence: 99%
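The two Word2vec training objectives mentioned above differ only in which direction they predict: CBOW predicts the middle word from its context, Skip-Gram predicts each context word from the center word. A minimal sketch of the training-pair generation (a simplified illustration, not the cited implementation; function names are hypothetical):

```python
def cbow_pairs(tokens, window=2):
    """CBOW: predict the middle word from its surrounding context."""
    pairs = []
    for i, target in enumerate(tokens):
        context = tokens[max(0, i - window):i] + tokens[i + 1:i + 1 + window]
        if context:
            pairs.append((context, target))
    return pairs

def skipgram_pairs(tokens, window=2):
    """Skip-Gram: predict each context word from the center word."""
    pairs = []
    for i, center in enumerate(tokens):
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if j != i:
                pairs.append((center, tokens[j]))
    return pairs

toks = ["the", "cat", "sat", "on", "mat"]
print(cbow_pairs(toks, window=1)[:2])
# [(['cat'], 'the'), (['the', 'sat'], 'cat')]
print(skipgram_pairs(toks, window=1)[:3])
# [('the', 'cat'), ('cat', 'the'), ('cat', 'sat')]
```

In the full algorithm these pairs train a shallow network whose input weights become the word vectors.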
“…Unlike traditional deep learning methods, mapping-based deep learning focuses on learning a mapping function of the input data rather than directly classifying or regressing it. In mapping-based deep learning, the input data undergoes layer-by-layer nonlinear transformation and feature extraction through a multi-layer neural network [6]. Each layer learns an increasingly abstract feature representation, and the input data is finally mapped into a high-dimensional representation space.…”
Section: Identification Of Database Security Intrusion Behavior Under...mentioning
confidence: 99%
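The layer-by-layer nonlinear mapping described in this excerpt can be sketched as a small stack of weight matrices with a nonlinearity between layers. The tanh activation and the dimensions below are illustrative assumptions, not taken from the cited work.

```python
import numpy as np

def map_to_representation(x, weights):
    """Map input x into a representation space via stacked
    nonlinear layers: h_{k+1} = tanh(W_k h_k)."""
    h = x
    for W in weights:
        h = np.tanh(W @ h)   # nonlinear transform + feature extraction
    return h

rng = np.random.default_rng(0)
# Three layers mapping a 4-d input into a 16-d representation space.
Ws = [rng.normal(size=(8, 4)),
      rng.normal(size=(12, 8)),
      rng.normal(size=(16, 12))]
z = map_to_representation(rng.normal(size=4), Ws)
print(z.shape)   # (16,)
```

Each successive matrix widens (or reshapes) the feature space, so the final `z` is the high-dimensional representation on which a detector or classifier would operate.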
“…Based on the above comparison, in the field of sentiment analysis [24], deep learning methods developed in recent years [25][26][27] can automatically and quickly extract relevant features from large-scale text data and capture deep semantic information more easily, yielding better classification results. However, limitations remain in the word-vector representation and neural-network feature-extraction processes of deep learning methods [28][29][30], which may lead to incomplete feature extraction or a failure to adequately capture semantic information, degrading the classification results. To address this problem, this paper constructs BERT-ETextCNN-ELSTM (BERT-Enhanced Convolution Neural Networks-Enhanced Long Short-Term Memory), an improved CNN-LSTM model built on BERT, to improve the accuracy and efficiency of comment sentiment analysis.…”
Section: Related Studiesmentioning
confidence: 99%
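The hybrid described above couples convolutional feature extraction with an LSTM. A minimal numpy sketch of that coupling is given below; this is not the authors' BERT-ETextCNN-ELSTM architecture — the single-kernel convolution, the bare LSTM cell, and all dimensions are illustrative assumptions.

```python
import numpy as np

def conv1d_valid(seq, kernel):
    """1-D 'valid' convolution over a (T, d) token-embedding matrix
    with a (k, d) kernel, producing one local n-gram feature per window."""
    k = kernel.shape[0]
    return np.array([np.sum(seq[t:t + k] * kernel)
                     for t in range(seq.shape[0] - k + 1)])

def lstm_step(x, h, c, W):
    """One LSTM step; W maps [x; h] to the four gate pre-activations."""
    z = W @ np.concatenate([x, h])
    i, f, o, g = np.split(z, 4)
    sig = lambda v: 1.0 / (1.0 + np.exp(-v))
    i, f, o = sig(i), sig(f), sig(o)
    c = f * c + i * np.tanh(g)          # update cell state
    return o * np.tanh(c), c            # new hidden and cell states

rng = np.random.default_rng(0)
emb = rng.normal(size=(6, 5))                        # 6 tokens, 5-d embeddings
feats = conv1d_valid(emb, rng.normal(size=(2, 5)))   # CNN: local features
h = c = np.zeros(3)
W = rng.normal(size=(12, 4))                         # 4*hdim rows, (1+hdim) cols
for f in feats:                                      # LSTM reads CNN features in order
    h, c = lstm_step(np.array([f]), h, c, W)
print(h.shape)   # (3,)
```

The final hidden state `h` summarizes the convolved sequence and would feed the sentiment classifier; in the cited model, BERT embeddings replace the random `emb` matrix.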