2021
DOI: 10.3390/asi4040085
An Improved Model for Analyzing Textual Sentiment Based on a Deep Neural Network Using Multi-Head Attention Mechanism

Abstract: Due to the increasing growth of social media content on websites such as Twitter and Facebook, analyzing textual sentiment has become a challenging task. Therefore, many studies have focused on textual sentiment analysis. Recently, deep learning models, such as convolutional neural networks and long short-term memory, have achieved promising performance in sentiment analysis. These models have proven their ability to cope with the arbitrary length of sequences. However, when they are used in the feature extrac…

Cited by 21 publications (10 citation statements)
References 50 publications
“…The attention mechanism is a machine learning technique that enables the model to focus on the most relevant and important parts when processing input data, thereby improving the performance and efficiency of the model (Sharaf Al-deen et al, 2021 ). The attention mechanism is inspired by human visual attention, that is, when humans observe a scene, they will automatically focus on the region of interest and ignore other irrelevant regions.…”
Section: Methods
confidence: 99%
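The statement above describes attention only in prose. As a minimal illustrative sketch, the core scaled dot-product step that multi-head attention repeats per head can be written in a few lines of NumPy; the toy shapes and random inputs here are assumptions for demonstration, not the configuration used in the cited paper.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Weight each value by how relevant its key is to each query,
    so the model 'focuses' on the most relevant positions."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # query-key similarity
    scores -= scores.max(axis=-1, keepdims=True)    # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax: each row sums to 1
    return weights @ V, weights

# Toy example: a sequence of 3 positions with 4-dimensional representations.
rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 4))
K = rng.standard_normal((3, 4))
V = rng.standard_normal((3, 4))
out, w = scaled_dot_product_attention(Q, K, V)
print(w.sum(axis=-1))  # attention weights per position sum to 1
```

A multi-head variant simply runs this computation in parallel over several learned projections of Q, K, and V and concatenates the results.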
“…The summaries of the literature review are shown in TABLE I: CNN (Ref [7]), Stacked residual LSTM (Ref [9]), SR-LSTM (Ref [10]), CNN-LSTM (Ref [11]), Co-LSTM (Ref [12]), CNN-RNN-LSTM (Ref [13]), Self-Attention (Ref [14]), CNN-BiLSTM with a scalable multi-channel dilated joint design (Ref [16]), DNN with Multi-Head Attention (Ref [17]), pre-Attention-based BiLSTM (Ref [19]), BERT (Ref [20]), multi-modal recurrent neural network-based Attention (Ref [21]), Attention over Attention…”
Section: Related Work
confidence: 99%
“…The authors’ model addresses the complexities of other languages with a character embedding. Al-deen et al (2021) proposed a combination of deep neural networks with a multi-head attention mechanism. They also conducted sentiment analysis and text classification tasks using a large Twitter dataset on the major US airline problems to evaluate the performance of their model.…”
Section: Social Network Analysis and Decision-making Support For Airl...
confidence: 99%