2021 Fifth International Conference on I-Smac (IoT in Social, Mobile, Analytics and Cloud) (I-Smac) 2021
DOI: 10.1109/i-smac52330.2021.9640861
But how robust is RoBERTa actually?: A Benchmark of SOTA Transformer Networks for Sexual Harassment Detection on Twitter

Cited by 2 publications (1 citation statement)
References 11 publications
“…However, RoBERTa also performs exceptionally well in sentiment analysis, thanks to its deeper language representations [39]. Furthermore, some researchers have suggested that RoBERTa exhibits advantages over other state-of-the-art Transformer architectures in tasks related to text classification and detection [40,41]. Subsequently, we define two fully connected layers, each comprising a linear layer and a ReLU activation function.…”
Section: Two-Level Text Feature Extraction Module
confidence: 99%