13th International Conference on Machine Learning and Computing (2021)
DOI: 10.1145/3457682.3457717
A Stance Detection Approach Based on Generalized Autoregressive Pretrained Language Model in Chinese Microblogs

Cited by 2 publications (1 citation statement); references 4 publications.
“…Both are large language models based on the Transformer architecture (Vaswani et al 2017), and both models regularly achieve very high accuracy on many prediction tasks. While there are studies employing XLNet (SU et al 2021;Yang et al 2019), previous research suggests that RoBERTa can achieve state-of-the-art results outperforming BERT and XLNet (Slovikovskaya 2019;Dulhanty et al 2019;Liu et al 2022;Barbieri et al 2020). Despite its strong performance, RoBERTa has a maximum input length when processing texts.…”
Section: Stance Detection
confidence: 99%
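The citing passage notes that RoBERTa imposes a maximum input length (512 positions for the base model, counting the special boundary tokens). A minimal sketch of the standard workaround, truncating token sequences before encoding, is shown below; the function name, token-id values, and default length are illustrative assumptions, not code from the cited paper.

```python
def truncate_for_encoder(token_ids, max_len=512, bos_id=0, eos_id=2):
    """Truncate a token-id sequence so that, with begin/end-of-sequence
    tokens added, it fits within the encoder's maximum input length.
    The default ids (0 for BOS, 2 for EOS) are hypothetical placeholders."""
    body = token_ids[: max_len - 2]  # reserve two slots for BOS/EOS
    return [bos_id] + body + [eos_id]

# A toy "long document" of 1,000 token ids gets clipped to 512 positions.
ids = list(range(1000, 2000))
encoded = truncate_for_encoder(ids)
print(len(encoded))  # 512
```

Longer documents are typically either truncated this way or split into overlapping windows whose predictions are aggregated; truncation is simpler but discards everything past the length limit.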