2022
DOI: 10.1007/978-3-031-22792-9_5
MACEDONIZER - The Macedonian Transformer Language Model

Cited by 2 publications (2 citation statements)
References 13 publications
“…Jacob Devlin and his colleagues at Google developed BERT and released it in 2018. Google announced in 2019 that it had started using BERT in its search engine, and by the end of 2020 BERT was being used in nearly all English-language queries [6]. The ParsBERT model, presented in [5], is a monolingual language model that employs Google's BERT architecture.…”
Section: History of BERT and ParsBERT
confidence: 99%
“…Fast forward to today, decoder-based language models are the most prominent in the field, with the OpenAI GPT models (now in their fourth generation) being especially popular for instruction tuning [10]. However, their latest model released as open code (and also the latest one available for Serbian) is still GPT-2 [11], with efforts still focused mainly on the development of encoder-based models both for Serbian [12] and for similar Slavic languages [13][14][15][16].…”
Section: State of the Art
confidence: 99%