2021
DOI: 10.1057/s41270-021-00109-8
BERT: a sentiment analysis odyssey

Cited by 80 publications (46 citation statements)
References 44 publications
“…Specifically, comments were classified using Bidirectional Encoder Representations from Transformers (BERT), a transformer-based natural language machine learning classification algorithm with outstanding performance on subtle classification tasks because it encodes both semantics and the rich latent structure of sentences (5, 6). The superiority of BERT over other machine learning natural language classification models has been repeatedly established in varied real-world social science datasets (7–12).…”
Section: Methods
confidence: 99%
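The workflow described in the statement above — feeding user comments to a fine-tuned BERT model for sentiment classification — can be sketched as follows. This is a minimal illustration using the Hugging Face `transformers` library and its default sentiment model; it is not the exact configuration used in the cited studies.

```python
# Minimal sketch: classifying comments with a fine-tuned BERT-family model.
# Assumes the Hugging Face `transformers` library; the default model loaded
# by pipeline("sentiment-analysis") is a library convenience, not the setup
# from the cited papers.
from transformers import pipeline

# Load a transformer pre-trained on large text corpora and fine-tuned
# for binary sentiment classification.
classifier = pipeline("sentiment-analysis")

comments = [
    "The product exceeded my expectations.",
    "Terrible service, I will not return.",
]

# Each result is a dict with a predicted label and a confidence score.
results = classifier(comments)
for comment, result in zip(comments, results):
    print(f"{comment} -> {result['label']} ({result['score']:.3f})")
```

In practice, the cited studies fine-tune BERT on their own labeled social-science datasets rather than relying on an off-the-shelf sentiment head; the pipeline above only shows the inference interface.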
“…Regarding natural language processing (NLP), a tool that has been widely used is the so-called BERT (Devlin et al, 2018). The creators of BERT, members of a research team working for Google, mention that BERT incorporates a training database of writings with more than a billion words (see Devlin et al, 2018), a feature that has helped BERT reach success in more than 90% of the classification tasks (e.g., Alaparthi and Mishra, 2021). Note, however, that even well-trained judges do not agree with each other in rating sentiment from personal stories (Tausczik and Pennebaker, 2010, p. 26).…”
Section: Methods and Procedures
confidence: 99%
“…Zhao et al [23] proposed a knowledge-based language representation model BERT for aspect-based sentiment analysis. The main feature of this model is the integration of external emotional domain knowledge into BERT, which can obtain better performance with a small amount of training data. Alaparthi et al [24] compared the relative effectiveness of four sentiment analysis techniques and proved the undisputed advantage of BERT in text sentiment classification. Yenduri et al [25] proposed a novel customized BERT-oriented model for Twitter sentiment classification.…”
Section: Research on Sentiment Analysis Based on Transformer
confidence: 99%
“…BERT [24]: A powerful and open-source text pre-training model based on the transformer structure.…”
confidence: 99%