2022
DOI: 10.7717/peerj-cs.877
ACR-SA: attention-based deep model through two-channel CNN and Bi-RNN for sentiment analysis

Abstract: Convolutional Neural Networks (CNN) and Recurrent Neural Networks (RNN) have been successfully applied to Natural Language Processing (NLP), especially in sentiment analysis. Both architectures can achieve significant results across many NLP tasks. Previous research shows that RNN achieved more meaningful results than CNN because it can extract long-term dependencies. Meanwhile, CNN has its own advantage: it can extract high-level features using a local fixed-size context window at the input level. However…

Cited by 22 publications (8 citation statements)
References 59 publications
“…In traditional neural networks, all the input variables are independent of the output variable. RNNs can solve NLP problems such as predicting whether a sentence is positive or negative, spam classification, and time-series tasks such as stock or sales forecasting [49]. Bag of words, term frequency-inverse document frequency (TF-IDF) and Word2Vec are used for text preprocessing, converting text into vectors so that machine learning models can solve NLP problems.…”
Section: Methods
confidence: 99%
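The text-to-vector preprocessing that the excerpt above mentions can be illustrated with a minimal TF-IDF sketch. This is a toy, pure-Python version, not the cited paper's pipeline; the three-document corpus and all names are illustrative assumptions.

```python
import math
from collections import Counter

# Hypothetical toy corpus for illustration only.
docs = [
    "the movie was great",
    "the movie was terrible",
    "great movie and great story",
]

# Tokenize each document into lowercase word lists.
tokenized = [d.lower().split() for d in docs]
vocab = sorted({w for doc in tokenized for w in doc})

# Document frequency: number of documents containing each word.
df = {w: sum(1 for doc in tokenized if w in doc) for w in vocab}
n_docs = len(docs)

def tfidf_vector(doc_tokens):
    """Map a token list to a TF-IDF vector over the shared vocabulary."""
    counts = Counter(doc_tokens)
    total = len(doc_tokens)
    return [
        (counts[w] / total) * math.log(n_docs / df[w]) if w in counts else 0.0
        for w in vocab
    ]

vectors = [tfidf_vector(doc) for doc in tokenized]
```

A word that occurs in every document (here "movie") gets an inverse-document-frequency of log(1) = 0, so its weight vanishes — which is exactly why TF-IDF is preferred over raw counts for discriminative features.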
“…The third gate, the control gate, decides which values will be updated (outputting values between 0 and 1), while a tanh layer creates a vector of candidate values Ĉt. The last gate, the output gate, determines the value of the next hidden state [54].…”
Section: Phase II
confidence: 99%
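The gating described in this excerpt can be sketched as a single scalar LSTM time step. This is a minimal illustration of the standard LSTM equations, not code from the cited work; the weight parameterization (one input weight, one recurrent weight, one bias per gate) is a hypothetical toy setup.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h_prev, c_prev, w):
    """One LSTM time step on scalar inputs.

    w maps each gate name to an (input weight, recurrent weight, bias)
    triple — a toy parameterization for illustration.
    """
    # Forget gate: how much of the previous cell state to keep.
    f = sigmoid(w["f"][0] * x + w["f"][1] * h_prev + w["f"][2])
    # Input ("control") gate: which values to update, between 0 and 1.
    i = sigmoid(w["i"][0] * x + w["i"][1] * h_prev + w["i"][2])
    # Candidate cell values (the tanh-layer vector Ĉt in the excerpt).
    c_hat = math.tanh(w["c"][0] * x + w["c"][1] * h_prev + w["c"][2])
    # New cell state: mix old memory with the gated candidate.
    c = f * c_prev + i * c_hat
    # Output gate: determines the next hidden state.
    o = sigmoid(w["o"][0] * x + w["o"][1] * h_prev + w["o"][2])
    h = o * math.tanh(c)
    return h, c

# Illustrative weights shared across gates.
w = {g: (0.5, 0.1, 0.0) for g in ("f", "i", "c", "o")}
h, c = lstm_step(x=1.0, h_prev=0.0, c_prev=0.0, w=w)
```

Because every gate passes through a sigmoid, its output lies strictly between 0 and 1, so the cell state is updated gradually rather than overwritten — the mechanism that lets LSTMs carry long-term dependencies.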
“…A number of studies have focused on developing new methods for reducing the computational complexity of CNNs. For example, Kamyab et al [28] proposed an attention-based deep model for sentiment analysis that utilizes a two-channel CNN and a Bi-RNN model (ACR-SA). This model incorporates attention-based mechanisms with novel data processing techniques, word representation, and deep learning algorithms.…”
Section: Related Work
confidence: 99%