Proceedings of the Conference Recent Advances in Natural Language Processing - Large Language Models for Natural Language Processing, 2023
DOI: 10.26615/978-954-452-092-2_024

BERTabaporu: Assessing a Genre-specific Language Model for Portuguese NLP

Pablo da Costa, Matheus Pavan, Wesley dos Santos, et al.

Abstract: Transformer-based language models such as Bidirectional Encoder Representations from Transformers (BERT) are now mainstream in the NLP field, but extensions to languages other than English, to new domains and/or to more specific text genres are still in demand. In this paper we introduce BERTabaporu, a BERT language model that has been pre-trained on Twitter data in the Brazilian Portuguese language. The model is shown to outperform the best-known general-purpose model for this language in three Twitter-relat…
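As an illustration only, the sketch below shows how a pre-trained Portuguese BERT checkpoint of this kind could be loaded for masked-token prediction with the Hugging Face transformers library. The model identifier is an assumption for demonstration purposes, not a detail taken from the paper; substitute the checkpoint actually released by the authors.

```python
# Illustrative sketch: masked-token prediction with a Portuguese BERT model.
# The model identifier below is hypothetical; replace it with the authors'
# released checkpoint.
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

MODEL_ID = "pablocosta/bertabaporu-base-uncased"  # assumed identifier

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForMaskedLM.from_pretrained(MODEL_ID)

# A short Brazilian Portuguese, tweet-like sentence with one masked token.
text = f"Hoje o dia está muito {tokenizer.mask_token}."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Locate the mask position and print the five most likely fillers.
mask_pos = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
top_ids = logits[0, mask_pos].topk(5).indices[0]
print(tokenizer.convert_ids_to_tokens(top_ids))
```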

Cited by 6 publications
References 21 publications