2021
DOI: 10.48550/arxiv.2109.15254
Preprint

SlovakBERT: Slovak Masked Language Model

Abstract: We introduce a new Slovak masked language model called SlovakBERT in this paper. It is the first Slovak-only transformers-based model trained on a sizeable corpus. We evaluate the model on several NLP tasks and achieve state-of-the-art results. We publish the masked language model, as well as the subsequently fine-tuned models for part-of-speech tagging, sentiment analysis and semantic textual similarity.
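The masked-language-model objective the abstract refers to can be sketched as follows. This is a minimal illustration, not the authors' training code: it assumes the standard BERT/RoBERTa dynamic-masking recipe (select roughly 15% of positions; of those, 80% become a mask token, 10% a random vocabulary token, 10% stay unchanged), and the function name and mask token string are illustrative.

```python
import random

def mask_tokens(tokens, mask_token="<mask>", vocab=None, mask_prob=0.15, seed=0):
    """Sketch of BERT/RoBERTa-style dynamic masking.

    Selects ~mask_prob of positions as prediction targets; of those,
    80% are replaced by mask_token, 10% by a random vocabulary token,
    and 10% are left unchanged. The model is trained to recover the
    original token at every selected position.
    """
    rng = random.Random(seed)
    vocab = vocab if vocab is not None else sorted(set(tokens))
    masked, labels = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            labels.append(tok)          # original token is the target
            r = rng.random()
            if r < 0.8:
                masked.append(mask_token)   # 80%: replace with mask token
            elif r < 0.9:
                masked.append(rng.choice(vocab))  # 10%: random token
            else:
                masked.append(tok)          # 10%: keep unchanged
        else:
            labels.append(None)         # position not scored in the loss
            masked.append(tok)
    return masked, labels
```

In practice the corrupted sequence is fed to the Transformer and a cross-entropy loss is computed only at the positions where `labels` is not `None`.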

Cited by 1 publication (1 citation statement)
References 21 publications
“…Another interesting study is by the Slovak researchers M. Pikuliak et al. [14], who created a language model called SlovakBERT. This model uses the RoBERTa architecture (Liu et al. [15]) and is trained on a web-crawled corpus.…”
Section: It Is Awesome (mentioning)
confidence: 99%