2021
DOI: 10.1055/s-0041-1731390
Semantic Textual Similarity in Japanese Clinical Domain Texts Using BERT

Abstract: Background Semantic textual similarity (STS) captures the degree of semantic similarity between texts. It plays an important role in many natural language processing applications such as text summarization, question answering, machine translation, information retrieval, dialog systems, plagiarism detection, and query ranking. STS has been widely studied in the general English domain. However, there exist few resources for STS tasks in the clinical domain and in languages other than English, such as Japanese. …
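To make the STS setup concrete, here is a minimal sketch of scoring a sentence pair with a BERT regression head via the Hugging Face transformers library; the multilingual checkpoint and the example sentences are assumptions for illustration, not the model or data used in the paper.

```python
# Minimal sketch: STS as sentence-pair regression with a BERT head.
# The checkpoint below is an assumption, not the paper's clinical model.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_NAME = "bert-base-multilingual-cased"  # placeholder checkpoint

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
# num_labels=1 turns the classification head into a single regression output,
# the usual configuration for STS fine-tuning.
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=1)

def sts_score(sentence_a: str, sentence_b: str) -> float:
    """Return a raw similarity score for one sentence pair (head untrained here)."""
    inputs = tokenizer(sentence_a, sentence_b, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = model(**inputs).logits  # shape (1, 1)
    return logits.item()

# Hypothetical Japanese clinical sentence pair, for illustration only.
print(sts_score("患者は頭痛を訴えた。", "患者は頭が痛いと述べた。"))
```

After fine-tuning on STS pairs, the single output is typically trained to regress onto the human similarity rating (for example, a 0 to 5 scale).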


Cited by 21 publications (9 citation statements). References 15 publications.
“…Instead of the traditional unidirectional language model, or the shallow concatenation of two unidirectional language models, for pretraining, the algorithm adopts a masked language model, which can generate deep bidirectional language representations and be fine-tuned for specific downstream tasks [22]. At present, BERT plays an essential role in natural language processing research, in tasks such as entity relation extraction, text sentiment analysis, and text classification [23]. Gao et al [24] proposed a BERT-based medical relation extraction model, which combines the whole-sentence information obtained from the pretrained language model with the corresponding information of the two medical entities to complete the relation extraction task.…”
Section: Related Work
confidence: 99%
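The Gao et al statement above combines sentence-level information with the representations of two medical entities. The sketch below illustrates that general idea only, not their exact architecture; the checkpoint name and the number of relation classes are placeholder assumptions.

```python
# Illustrative sketch (not Gao et al.'s exact model): concatenate the sentence-
# level [CLS] vector with the vectors at two entity positions and classify.
import torch
import torch.nn as nn
from transformers import AutoModel

class EntityPairRelationClassifier(nn.Module):
    def __init__(self, model_name: str = "bert-base-multilingual-cased",
                 num_relations: int = 5):  # num_relations is a placeholder
        super().__init__()
        self.encoder = AutoModel.from_pretrained(model_name)
        hidden = self.encoder.config.hidden_size
        # [CLS] + entity-1 + entity-2 representations -> relation logits
        self.classifier = nn.Linear(hidden * 3, num_relations)

    def forward(self, input_ids, attention_mask, e1_pos, e2_pos):
        # e1_pos / e2_pos: LongTensors of shape (batch,) with the token index
        # of each entity (e.g. the position of an entity marker token).
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        hidden_states = out.last_hidden_state            # (batch, seq, hidden)
        cls_vec = hidden_states[:, 0]                     # sentence-level info
        batch_idx = torch.arange(hidden_states.size(0))
        e1_vec = hidden_states[batch_idx, e1_pos]         # first entity token
        e2_vec = hidden_states[batch_idx, e2_pos]         # second entity token
        return self.classifier(torch.cat([cls_vec, e1_vec, e2_vec], dim=-1))
```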
“…BERT consists of transformer encoder layers and is designed for masked language modeling and next-sentence prediction. It has been shown to predict semantic similarities between word- and sentence-pairs well [51,58], which makes it a potentially valuable tool for gesture generation, since mappings from words to gestures can be generalized over synonyms [23,43]. However, BERT has no built-in explainability, which limits its use for realizing designer intent.…”
Section: Large Language Models
confidence: 99%
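A small sketch of the word/sentence-pair similarity usage described in the statement above: mean-pooled BERT token embeddings compared with cosine similarity. The checkpoint and example phrases are assumptions; in practice an STS-fine-tuned encoder would usually give better-calibrated scores.

```python
# Sketch: sentence-pair similarity from mean-pooled BERT embeddings.
# The checkpoint name is an assumption for illustration.
import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def embed(text: str) -> torch.Tensor:
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state    # (1, seq, hidden)
    mask = inputs["attention_mask"].unsqueeze(-1)     # ignore padding positions
    return (hidden * mask).sum(1) / mask.sum(1)       # mean pooling

a, b = embed("wave goodbye"), embed("raise your hand and wave")
print(F.cosine_similarity(a, b).item())  # higher means more similar
```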
“…The BERT model uses bidirectional Transformer encoders, which enhances the generalization ability of the word-vector model, fully describes the relationships between characters, words, and sentences, and effectively represents contextual semantic information. BERT has become the mainstream model in the field of NLP [31,32].…”
Section: Related Work
confidence: 99%
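The bidirectional behaviour described above stems from BERT's masked-language-model pretraining; the minimal sketch below (checkpoint and example sentence are assumptions) shows a masked token being predicted from context on both sides.

```python
# Sketch: masked-token prediction, which uses context to the left and right.
# Checkpoint and example sentence are assumptions for illustration.
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")

text = f"The patient was prescribed {tokenizer.mask_token} for the headache."
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Locate the masked position and print the top candidate tokens.
mask_idx = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
top_ids = logits[0, mask_idx].topk(5, dim=-1).indices[0]
print(tokenizer.convert_ids_to_tokens(top_ids.tolist()))
```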