2022
DOI: 10.1007/s11042-022-13339-4
An efficient two-state GRU based on feature attention mechanism for sentiment analysis

Cited by 26 publications (9 citation statements)
References 34 publications
“…This investigation clearly identified the difficulties covered in the next section: preserving the authenticity of the guidelines and maintaining the integrity of the specifications [7][8][9].…”
Section: Challenges In Opinion Extraction and Sentiment Analysis
Mentioning confidence: 99%
“…The model lacks stability. Zulqarnain et al. [22] presented a two-state GRU (TS-GRU) based on the feature attention mechanism, focusing on word-feature capturing and sequential modeling to detect and classify sentiment polarity. The TS-GRU approach can be computationally demanding.…”
Section: Literature Review
Mentioning confidence: 99%
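The excerpt above describes a GRU encoder whose hidden states are re-weighted by a feature-attention layer before sentiment classification. A minimal sketch of that general idea, assuming standard PyTorch components and illustrative names and hyperparameters (not the authors' exact TS-GRU architecture):

import torch
import torch.nn as nn

class AttentiveGRUClassifier(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=128, num_classes=2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.gru = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        # Feature attention: score each time step's hidden state, then form a
        # weighted sum that emphasises sentiment-bearing positions.
        self.attn = nn.Linear(hidden_dim, 1)
        self.classifier = nn.Linear(hidden_dim, num_classes)

    def forward(self, token_ids):                          # (batch, seq_len)
        states, _ = self.gru(self.embedding(token_ids))    # (batch, seq_len, hidden_dim)
        weights = torch.softmax(self.attn(states), dim=1)  # (batch, seq_len, 1)
        context = (weights * states).sum(dim=1)            # (batch, hidden_dim)
        return self.classifier(context)                    # sentiment logits

model = AttentiveGRUClassifier(vocab_size=10_000)
logits = model(torch.randint(0, 10_000, (4, 20)))          # 4 sequences of 20 token ids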
“…The Gated Recurrent Unit, initially introduced by Chung et al. [39], tackles the prevalent problem of long contextual dependencies that often cause gradient degradation in conventional, large RNNs. This innovation has since evolved into a modern architecture referred to as the "two-gated mechanism" approach.…”
Section: Gated Recurrent Unit
Mentioning confidence: 99%
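For reference, the "two-gated mechanism" named in the excerpt consists of an update gate and a reset gate; the standard GRU cell (following Cho and Chung et al.) can be written as

z_t = \sigma(W_z x_t + U_z h_{t-1} + b_z)                      % update gate
r_t = \sigma(W_r x_t + U_r h_{t-1} + b_r)                      % reset gate
\tilde{h}_t = \tanh(W_h x_t + U_h (r_t \odot h_{t-1}) + b_h)   % candidate state
h_t = (1 - z_t) \odot h_{t-1} + z_t \odot \tilde{h}_t          % new hidden state

where \sigma is the logistic sigmoid and \odot is element-wise multiplication. The two gates control how much past context is retained versus overwritten at each step, which is what mitigates the gradient degradation over long contexts that the excerpt refers to.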