2021 IEEE 34th International Symposium on Computer-Based Medical Systems (CBMS)
DOI: 10.1109/cbms52027.2021.00056
A GPT-2 Language Model for Biomedical Texts in Portuguese

Cited by 24 publications (12 citation statements)
References 15 publications
“…At its core, GPT is founded upon the innovative Transformer architecture, a model that has revolutionized the field by effectively capturing long-term dependencies within sequences, making it exceptionally well-suited for tasks involving language understanding and generation [ 45 , 46 ]. The GPT family has multiple versions: The GPT-2 model, with 1.5 billion parameters, is capable of generating extensive sequences of text while adapting to the style and content of arbitrary inputs [ 47 ]. Moreover, GPT-2 can also perform various NLP tasks, such as classification [ 47 ].…”
Section: Methods
confidence: 99%
“…Its successor, GPT-2, was also trained on a massive collection of text data using a language-modeling task. However, unlike the previous version, it could generate longer text sequences consistent with human language [32].…”
Section: OpenAI and the Development of ChatGPT
confidence: 99%
“…Perceived importance (PIM) refers to students' general beliefs about technology and is linked to the cognitive component [22][23][24][25][26][27][28][29][30][31][32][33][34][35][36][37][38]. The perceived importance of a technological tool affects the intention to use it in the future [40].…”
Section: Hypotheses from the Constructs of the Cognitive Component of...
confidence: 99%
“…Models like the BERT, ELMO, and GPT-2 develop a rich context in terms of efficiency and predicting various interpretations. Schneider et al [103] proposed the GPT-2 to identify a Portuguese biomedical text. They used a fine-tuning approach for transfer learning; however, they manually annotated the public dataset for classification tasks.…”
Section: GPT-2 Model for Contextual Word Embeddings (CWE)
confidence: 99%