Proceedings of the 2022 2nd International Conference on Modern Educational Technology and Social Sciences (ICMETSS 2022)
DOI: 10.2991/978-2-494069-45-9_33
Emotion Analysis of Microblog Epidemic Coexistence Based on BERT

Abstract: Since the outbreak of the novel coronavirus, many scholars have used microblog data to produce scientific and effective analyses of epidemic prevention and control. Spatio-temporal analysis of the evolution of public opinion on COVID-19 based on microblog data supports a more effective response to the epidemic. In this paper, large numbers of public comments posted on Sina Weibo during different time periods were collected, and BERT was used for sentiment analysis, so as to…
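A minimal sketch of the kind of BERT-based sentiment analysis the abstract describes, assuming a Hugging Face transformers setup. The checkpoint name, two-class label set, and classify helper are illustrative assumptions; the paper's exact model and fine-tuning details are not given here.

```python
# Sketch: sentiment classification of Chinese microblog comments with BERT.
# "bert-base-chinese" and the negative/positive label set are assumptions;
# in practice the classification head must first be fine-tuned on labeled
# Weibo comments, otherwise its outputs are effectively random.
import torch
from transformers import BertTokenizer, BertForSequenceClassification

MODEL_NAME = "bert-base-chinese"  # assumed base checkpoint

tokenizer = BertTokenizer.from_pretrained(MODEL_NAME)
model = BertForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=2)
model.eval()

def classify(comments):
    """Return a predicted label index (0=negative, 1=positive, assumed) per comment."""
    batch = tokenizer(comments, padding=True, truncation=True,
                      max_length=128, return_tensors="pt")
    with torch.no_grad():
        logits = model(**batch).logits
    return logits.argmax(dim=-1).tolist()

# Hypothetical Weibo-style comments for illustration.
print(classify(["今天终于解封了，太开心了！", "疫情什么时候才能结束……"]))
```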

Cited by 1 publication (1 citation statement)
References 11 publications
“…High-quality models can be produced quickly and effectively with little effort and training. It focuses on employing the novel Masked Language Model (MLM) as opposed to the conventional one-way language model or the technique of shallow splicing two one-way language models for pre-training in order to construct an intricate bidirectional language representation [34]. Despite the model being large and slow to train due to the training framework and corpus, BERT can process more text and language.…”
Section: Methods
confidence: 99%
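The citing statement above contrasts BERT's Masked Language Model (MLM) pre-training with one-way language models. A minimal sketch of that objective, assuming the transformers fill-mask pipeline; the checkpoint and example sentence are illustrative assumptions.

```python
# Sketch of the MLM objective: BERT predicts a masked token from both the
# left and right context, unlike a left-to-right (one-way) language model.
# "bert-base-uncased" is an assumed checkpoint for illustration.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# Both "track the" (left) and "of the epidemic" (right) inform the prediction.
for candidate in fill_mask("Daily reports help track the [MASK] of the epidemic."):
    print(f"{candidate['token_str']:>12}  score={candidate['score']:.3f}")
```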