2022
DOI: 10.1016/j.procs.2022.09.122

Medical Named Entity Recognition using Surrounding Sequences Matching

Cited by 10 publications (2 citation statements). References 20 publications.
“…In contrast, pre-trained models used self-supervised learning methods to learn deep language representations from large-scale corpora, acquiring richer and more precise semantic features [15]. Additionally, pre-trained models possessed strong generalization and transfer learning capabilities, enabling them to accommodate diverse domains and tasks [16]. When faced with a small amount of labeled data, they achieved good performance with only fine-tuning.…”
Section: Related Work
confidence: 99%
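
The fine-tuning workflow this citing passage describes can be illustrated with a short sketch. The snippet below fine-tunes a pre-trained BERT-style encoder for token-level NER with the Hugging Face transformers API; the checkpoint name, label set, and the tiny in-memory dataset are illustrative assumptions, not details from the cited papers.

```python
# Minimal sketch, assuming the Hugging Face transformers API: fine-tune a
# pre-trained encoder for token-level NER on a small labeled dataset.
# Checkpoint, tag set, and data below are illustrative placeholders.
from transformers import (AutoModelForTokenClassification, AutoTokenizer,
                          DataCollatorForTokenClassification, Trainer,
                          TrainingArguments)

labels = ["O", "B-Disease", "I-Disease"]   # hypothetical tag set
checkpoint = "bert-base-cased"             # any BERT-style encoder

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForTokenClassification.from_pretrained(
    checkpoint, num_labels=len(labels))

def encode(example):
    """Tokenize one pre-split sentence and align word tags to subwords."""
    enc = tokenizer(example["tokens"], is_split_into_words=True,
                    truncation=True)
    word_ids = enc.word_ids()
    previous = [None] + word_ids[:-1]
    # Label only the first subword of each word; -100 is ignored by the loss.
    enc["labels"] = [
        example["ner_tags"][w] if w is not None and w != p else -100
        for w, p in zip(word_ids, previous)
    ]
    return enc

# Toy stand-in for a small labeled corpus (illustrative only).
raw = [{"tokens": ["Patient", "denies", "chest", "pain", "."],
        "ner_tags": [0, 0, 1, 2, 0]}]
train_data = [encode(ex) for ex in raw]

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="ner-out", num_train_epochs=3,
                           per_device_train_batch_size=16),
    train_dataset=train_data,
    data_collator=DataCollatorForTokenClassification(tokenizer),
)
trainer.train()
```

Only the small task-specific head and the encoder weights are updated here, which is what lets a modest amount of labeled data suffice, as the quoted passage notes.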
“…That same year, three studies were published that utilized the bidirectional encoder representations from transformers (BERT) architecture [28,43,46]. By 2021 [47][48][49][50][51][52][53][54], the BERT architecture and its variants had emerged as the primary NER model applied to EHRs, a trend that continues to this day [4,24,25,[55][56][57][58][59][60][61][62][63][64][65][66][67][68][69][70][71]. However, this self-attention mechanism was initially introduced in 2017 [72].…”
Section: Classification Models
confidence: 99%
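
To make the BERT-on-EHR usage this passage surveys concrete, the sketch below tags clinical text with the transformers token-classification pipeline. The checkpoint name is a hypothetical placeholder for any BERT variant fine-tuned on biomedical or EHR NER data.

```python
# Sketch of inference with a fine-tuned BERT-style NER model. The model
# ID is a hypothetical placeholder, not a real published checkpoint.
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="your-org/clinical-bert-ner",   # assumed fine-tuned checkpoint
    aggregation_strategy="simple",        # merge subword pieces into spans
)

text = "Patient denies chest pain but reports shortness of breath."
for ent in ner(text):
    print(ent["entity_group"], ent["word"], round(ent["score"], 3))
```

The aggregation_strategy option merges word pieces back into whole entity spans, which matters because BERT's subword tokenizer routinely splits clinical vocabulary.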