2023
DOI: 10.1109/access.2023.3259107

Sentiment and Context-Aware Hybrid DNN With Attention for Text Sentiment Classification

Abstract: A massive volume of unstructured data in the form of comments, opinions, and other sorts of data is generated in real time with the growth of Web 2.0. Due to the unstructured nature of the data, building an accurate predictive model for sentiment analysis remains a challenging task. While various DNN architectures have been applied to sentiment analysis with encouraging results, they suffer from high-dimensional feature spaces and treat all features equally. State-of-the-art methods cannot properly lever…

Citations: cited by 36 publications (3 citation statements)
References: 88 publications (139 reference statements)
“…However, the word representation used in the framework affects the classification accuracy of the whole framework, because the dimensionality of the word embeddings impacts the model's capability. Khan et al. (2023) introduced a Sentiment and Context Aware Attention-based Hybrid Deep Neural Network (SCA-HDNN) model with an attention mechanism that effectively identifies the salient features in the text. Initially, integrated wide-coverage sentiment lexicons are used to detect the sentiment features, and a BERT model is used to detect the sentiment of the extracted words.…”
Section: Related Work
confidence: 99%
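The cited SCA-HDNN combines lexicon-derived sentiment cues with an attention mechanism over contextual token features. The snippet below is a minimal sketch of that general idea only, assuming a toy sentiment lexicon, placeholder encoder states, and an illustrative attention-pooling layer; it is not the authors' actual architecture.

```python
# Hedged sketch: attention pooling over token features plus lexicon-derived
# sentiment priors, loosely following the SCA-HDNN idea described above.
# The lexicon, dimensions, and layer choices are illustrative assumptions.
import torch
import torch.nn as nn

class AttentionPooling(nn.Module):
    def __init__(self, hidden_dim: int):
        super().__init__()
        self.score = nn.Linear(hidden_dim, 1)

    def forward(self, token_states: torch.Tensor) -> torch.Tensor:
        # token_states: (batch, seq_len, hidden_dim)
        weights = torch.softmax(self.score(token_states).squeeze(-1), dim=-1)
        # Weighted sum emphasises the tokens the attention layer deems salient.
        return torch.einsum("bs,bsh->bh", weights, token_states)

# Toy sentiment lexicon (assumed): maps words to prior polarity scores.
LEXICON = {"good": 1.0, "great": 1.0, "bad": -1.0, "terrible": -1.0}

def lexicon_features(tokens):
    # One scalar sentiment prior per token; unknown words get a neutral 0.0.
    return torch.tensor([[LEXICON.get(t.lower(), 0.0) for t in tokens]])

if __name__ == "__main__":
    tokens = ["the", "movie", "was", "great"]
    hidden = torch.randn(1, len(tokens), 128)        # stand-in for encoder states
    priors = lexicon_features(tokens).unsqueeze(-1)  # (1, seq_len, 1)
    pooled = AttentionPooling(129)(torch.cat([hidden, priors], dim=-1))
    print(pooled.shape)  # torch.Size([1, 129])
```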
“…The authors (Liu et al., 2023a, 2023b) proposed a new variant of BERT-BASE, which uses explicitly defined word embeddings extracted from the sentences, and on the SST-2 data set it achieved an accuracy of 93.85%. The authors (Khan et al., 2023) proposed a hybrid model by adding an attention mechanism to BERT-CNN and achieved an accuracy of 91.90% on the SST-2 data set. Athira et al. (2023) exploited BioBERT, a BERT model developed for biomedical language representation, and tested it on the QQP data set.…”
Section: Related Work
confidence: 99%
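As a rough illustration of what "adding an attention mechanism to BERT-CNN" can look like, the sketch below runs a 1-D convolution over BERT token states and attention-pools the convolved features before classification. The layer sizes, kernel width, and attention form are assumptions, and a random tensor stands in for BERT's output to keep the example offline; this is not the reported model of Khan et al. (2023).

```python
# Hedged sketch of a BERT-CNN classifier with attention pooling.
# Hyperparameters and layer shapes are illustrative assumptions.
import torch
import torch.nn as nn

class BertCnnAttention(nn.Module):
    def __init__(self, hidden_dim=768, n_filters=100, kernel_size=3, n_classes=2):
        super().__init__()
        # 1-D convolution over the sequence of BERT token states.
        self.conv = nn.Conv1d(hidden_dim, n_filters, kernel_size, padding=1)
        self.attn = nn.Linear(n_filters, 1)   # scores each convolved position
        self.fc = nn.Linear(n_filters, n_classes)

    def forward(self, bert_states: torch.Tensor) -> torch.Tensor:
        # bert_states: (batch, seq_len, hidden_dim), e.g. BERT's last_hidden_state.
        feats = torch.relu(self.conv(bert_states.transpose(1, 2))).transpose(1, 2)
        weights = torch.softmax(self.attn(feats).squeeze(-1), dim=-1)
        pooled = torch.einsum("bs,bsf->bf", weights, feats)
        return self.fc(pooled)                # class logits

if __name__ == "__main__":
    # Random tensor stands in for BERT's output so the example runs offline.
    dummy_states = torch.randn(2, 32, 768)
    print(BertCnnAttention()(dummy_states).shape)  # torch.Size([2, 2])
```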
“…Different types of deep learning models [20][21][22][23] have gained a lot of popularity in NLP tasks such as question answering [24,25], finding the semantic similarity of texts [26,27], text analysis [28,29], etc. The semantic similarity of two questions is calculated using a Siamese MaLSTM in [30].…”
Section: Introduction
confidence: 99%
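For context on the Siamese MaLSTM mentioned in [30], the sketch below shows the usual formulation: two weight-shared LSTM encoders and an exp(−L1) Manhattan similarity between their final hidden states. The embedding layer, vocabulary size, and dimensions here are illustrative assumptions, not the setup used in the cited work.

```python
# Hedged sketch of a Siamese MaLSTM for question-pair similarity.
# Vocabulary size, embedding size, and hidden size are assumptions.
import torch
import torch.nn as nn

class SiameseMaLSTM(nn.Module):
    def __init__(self, vocab_size=10000, embed_dim=100, hidden_dim=50):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)  # shared weights

    def encode(self, token_ids: torch.Tensor) -> torch.Tensor:
        _, (h_n, _) = self.lstm(self.embed(token_ids))
        return h_n[-1]                                   # final hidden state per question

    def forward(self, q1_ids: torch.Tensor, q2_ids: torch.Tensor) -> torch.Tensor:
        h1, h2 = self.encode(q1_ids), self.encode(q2_ids)
        # Manhattan similarity in (0, 1]: identical encodings give 1.
        return torch.exp(-torch.sum(torch.abs(h1 - h2), dim=-1))

if __name__ == "__main__":
    q1 = torch.randint(0, 10000, (4, 12))   # toy batches of token ids
    q2 = torch.randint(0, 10000, (4, 12))
    print(SiameseMaLSTM()(q1, q2))          # 4 similarity scores
```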