2020 5th International Conference on Computer and Communication Systems (ICCCS)
DOI: 10.1109/icccs49078.2020.9118434
A Commodity Review Sentiment Analysis Based on BERT-CNN Model

Cited by 37 publications (17 citation statements: 1 supporting, 16 mentioning, 0 contrasting)
References 6 publications
“…It could be because the CNN algorithm can extract local and global features very well from the vectors using the convolutional, pooling, and fully connected (dense) layers, which preserves the semantic context of the text data. This finding supports studies on sentiment analysis of commodity reviews and on stance detection for credibility analysis of information on social media conducted by [24], [25]. These studies showed that BERT embeddings combined with a CNN obtained better results than a CNN alone, which ignores contextual semantic relations in text.…”
Section: Discussion (supporting)
confidence: 86%
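The pipeline this statement describes, BERT contextual embeddings feeding a CNN whose convolution, pooling, and dense layers extract local and global features, can be illustrated with a minimal sketch. This is not the cited papers' released code; the checkpoint name, filter counts, and kernel sizes are illustrative assumptions.

```python
import torch
import torch.nn as nn
from transformers import BertModel

class BertCnnClassifier(nn.Module):
    """Sketch of a BERT-CNN sentiment classifier (assumed hyperparameters)."""

    def __init__(self, num_classes=2, num_filters=128, kernel_sizes=(3, 4, 5)):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-uncased")
        hidden = self.bert.config.hidden_size  # 768 for BERT-base
        # One Conv1d per kernel size, sliding over the token dimension.
        self.convs = nn.ModuleList(
            [nn.Conv1d(hidden, num_filters, k) for k in kernel_sizes]
        )
        self.fc = nn.Linear(num_filters * len(kernel_sizes), num_classes)

    def forward(self, input_ids, attention_mask):
        # Contextual embeddings from BERT: (batch, seq_len, hidden).
        h = self.bert(input_ids, attention_mask=attention_mask).last_hidden_state
        h = h.transpose(1, 2)  # Conv1d expects (batch, channels, seq_len)
        # Convolve, apply ReLU, then global max-pool each feature map.
        pooled = [torch.relu(c(h)).max(dim=2).values for c in self.convs]
        return self.fc(torch.cat(pooled, dim=1))  # class logits
```

Multiple kernel widths mimic n-gram detectors of different sizes, while max-pooling keeps the strongest local feature from each filter, which is the "local and global features" intuition in the quote.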
“…BERT is designed to distinguish words that take on different meanings in different contexts. BERT pre-trains deep bidirectional representations from unlabeled text by jointly conditioning on both left and right context in all layers [11], [19]. The pre-trained BERT model can then be fine-tuned with one additional output layer to create models for various tasks, including question answering and language inference, without substantial modification of the task-specific architecture.…”
Section: Proposed Methods (mentioning)
confidence: 99%
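The "one additional output layer" recipe in this statement corresponds to standard sequence-classification fine-tuning. A hedged sketch follows, using the HuggingFace head that adds a single linear layer on top of BERT; the checkpoint, label count, learning rate, and toy examples are assumptions.

```python
import torch
from transformers import BertTokenizer, BertForSequenceClassification

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
# BertForSequenceClassification = pre-trained BERT + one linear output layer.
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)

batch = tokenizer(
    ["great phone, totally worth it", "battery died after two days"],
    padding=True, truncation=True, return_tensors="pt",
)
labels = torch.tensor([1, 0])  # toy sentiment labels for illustration

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
loss = model(**batch, labels=labels).loss  # cross-entropy over the new head
loss.backward()
optimizer.step()
```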
“…The literature also includes studies exploring the more advanced embedding technique BERT and its variants to improve sentiment analysis for reviews. For instance, [34] improved sentiment analysis for commodity reviews using BERT-CNN, with F-score results indicating that the BERT-CNN combination (84.3%) outperformed BERT alone (82%) and CNN alone (70.9%). Similarly, [12] developed SenBERT-CNN to analyze JD.com (mobile phone merchant) reviews by combining BERT and CNN, the latter of which was used to extract deep features of the text.…”
Section: Deep Learning Approaches To Reviews (mentioning)
confidence: 99%
“…BERT-variant models were pre-trained by incorporating the context of each word within the text of Wikipedia and BooksCorpus [44], and the embeddings are then passed through a classifier for predictions. Because they produce contextualized word embeddings, they achieve state-of-the-art results on Natural Language Processing tasks [12, 34]. The BERT-base model is a bidirectional (both left-to-right and right-to-left) transformer pre-trained on large amounts of unlabeled text to learn a language representation that can be fine-tuned for specific classification tasks (see [44] for further details).…”
Section: Feature Extraction (mentioning)
confidence: 99%
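This statement describes the feature-extraction use of BERT: the pre-trained model stays frozen and only its contextual embeddings are passed to a separate classifier. A minimal sketch under stated assumptions follows; the [CLS]-vector choice and the logistic-regression downstream model are illustrative, not the cited papers' exact pipelines.

```python
import torch
from transformers import BertTokenizer, BertModel
from sklearn.linear_model import LogisticRegression

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
bert = BertModel.from_pretrained("bert-base-uncased").eval()

def embed(texts):
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():  # frozen BERT: embeddings only, no fine-tuning
        out = bert(**batch).last_hidden_state
    return out[:, 0].numpy()  # [CLS] token vector as the sentence feature

# Toy labeled examples for illustration only.
X = embed(["fast shipping, works as advertised", "screen cracked on arrival"])
clf = LogisticRegression().fit(X, [1, 0])
```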