2021
DOI: 10.2196/22795
Adapting Bidirectional Encoder Representations from Transformers (BERT) to Assess Clinical Semantic Textual Similarity: Algorithm Development and Validation Study

Abstract: Background: Natural Language Understanding enables automatic extraction of relevant information from clinical text data, which are acquired every day in hospitals. In 2018, the language model Bidirectional Encoder Representations from Transformers (BERT) was introduced, generating new state-of-the-art results on several downstream tasks. The National NLP Clinical Challenges (n2c2) is an initiative that strives to tackle such downstream tasks on domain-specific clinical data. In this paper, we presen…

Cited by 16 publications (12 citation statements)
References 25 publications
“…It has often been reported that BERT exhibits high performance, even with clinical text [36-39]. This also held true in this study, in which a model combining BERT and a Bi-LSTM on clinical text recorded in daily practice allowed for fall prediction with an accuracy equal to or higher than that of conventional risk assessment tools.…”
Section: Discussion
Citation type: mentioning
Confidence: 99%
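For orientation, a minimal sketch of the BERT-plus-Bi-LSTM pattern this quote describes, in PyTorch. The cited study's exact architecture is not given here, so the checkpoint name, layer sizes, pooling choice, and example input are all assumptions:

```python
# Sketch of a BERT + Bi-LSTM classifier of the kind the quote describes.
# Hidden size, pooling, and the toy input are illustrative assumptions.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class BertBiLstmClassifier(nn.Module):
    def __init__(self, model_name="bert-base-uncased", hidden=128, num_labels=2):
        super().__init__()
        self.bert = AutoModel.from_pretrained(model_name)
        self.lstm = nn.LSTM(self.bert.config.hidden_size, hidden,
                            batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * hidden, num_labels)

    def forward(self, input_ids, attention_mask):
        # Token-level contextual embeddings from BERT ...
        states = self.bert(input_ids=input_ids,
                           attention_mask=attention_mask).last_hidden_state
        # ... re-encoded by a Bi-LSTM; the final forward and backward
        # hidden states are concatenated and classified.
        _, (h, _) = self.lstm(states)
        pooled = torch.cat([h[-2], h[-1]], dim=-1)
        return self.head(pooled)

tok = AutoTokenizer.from_pretrained("bert-base-uncased")
batch = tok(["Patient unsteady on feet during transfer."],
            return_tensors="pt", padding=True, truncation=True)
logits = BertBiLstmClassifier()(batch["input_ids"], batch["attention_mask"])
```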
“…Therefore, it is suitable for evaluating the relationship between two sentences, such as their similarity. However, previous research [9] showed that Bioclinical BERT does not handle prescriptions well, since they carry less semantic information. The author proposed a medication graph to enhance the backbone BERT model, but the graph is constructed from local data and cannot objectively indicate relationships between unseen drug pairs.…”
Section: Related Work
Citation type: mentioning
Confidence: 97%
“…End-to-end deep learning models with fine-tuning are the mainstream in this domain. State-of-the-art models such as BERT [8] and its variants, as well as XLNet [18], are modified and applied to similarity calculation with task-specific fine-tuning [9], [19]. The multitask learning (MTL) paradigm is also widely used and has been verified to learn powerful representations from multiple data sources [20].…”
Section: Related Work
Citation type: mentioning
Confidence: 99%
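The task-specific fine-tuning this quote mentions typically frames sentence-pair similarity as regression on a single output. A minimal sketch under that assumption, using a generic bert-base-uncased checkpoint and toy labels rather than any setup from the cited papers:

```python
# Sketch of sentence-pair similarity fine-tuning: with num_labels=1,
# Hugging Face's sequence classification head uses MSE (regression).
# Checkpoint, learning rate, and the toy pairs/labels are assumptions.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tok = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=1)  # single score -> regression

pairs = [("Take one tablet daily.", "Take 1 tab po qd."),
         ("No known drug allergies.", "Chest x-ray shows effusion.")]
scores = torch.tensor([4.5, 0.5])  # STS-style 0-5 similarity labels

batch = tok([a for a, _ in pairs], [b for _, b in pairs],
            return_tensors="pt", padding=True, truncation=True)
optim = torch.optim.AdamW(model.parameters(), lr=2e-5)

model.train()
out = model(**batch, labels=scores)  # MSE loss against the gold scores
out.loss.backward()
optim.step()
```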
“…Bidirectional encoder representations from transformers (BERT) is a contextualized embedding method that preserves distances in meaning via multihead attention [17]. After being pretrained on relevant corpora, and with appropriate architecture modification, BERT extracts meaningful embeddings from clinical text [18, 19].…”
Section: Introduction
Citation type: mentioning
Confidence: 99%
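A minimal sketch of extracting such embeddings from a pretrained clinical BERT. The checkpoint name and mean pooling are assumptions (pooling the [CLS] token is a common alternative), and the sentences are toy examples:

```python
# Sketch of pulling sentence embeddings out of a pretrained clinical
# BERT, then comparing them with cosine similarity. The checkpoint and
# the mean-pooling choice are assumptions, not the paper's method.
import torch
from transformers import AutoModel, AutoTokenizer

name = "emilyalsentzer/Bio_ClinicalBERT"  # assumed clinical checkpoint
tok = AutoTokenizer.from_pretrained(name)
model = AutoModel.from_pretrained(name).eval()

sents = ["The patient denies chest pain.", "Pt reports no chest discomfort."]
batch = tok(sents, return_tensors="pt", padding=True, truncation=True)

with torch.no_grad():
    hidden = model(**batch).last_hidden_state      # (batch, tokens, dim)
mask = batch["attention_mask"].unsqueeze(-1)       # zero out padding
emb = (hidden * mask).sum(1) / mask.sum(1)         # mean-pooled embeddings

sim = torch.nn.functional.cosine_similarity(emb[0], emb[1], dim=0)
print(f"cosine similarity: {sim.item():.3f}")
```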