2021
DOI: 10.1101/2021.11.29.470486
Preprint

GraphPrompt: Biomedical Entity Normalization Using Graph-based Prompt Templates

Abstract: Biomedical entity normalization unifies the language across biomedical experiments and studies, and further enables us to obtain a holistic view of life sciences. Current approaches mainly study the normalization of more standardized entities such as diseases and drugs, while disregarding the more ambiguous but crucial entities such as pathways, functions and cell types, hindering their real-world applications. To achieve biomedical entity normalization on these under-explored entities, we first introduce an e…

Cited by 3 publications (2 citation statements)
References 34 publications
“…Prompt-Based Learning. In recent two years, prompt-based learning methods [42,43,45] have shown strong capabilities to improve the downstream tasks of NLP. Schick et al [42] utilize prompt templates to provide label hints of models for classification and generation language tasks.…”
Section: Related Work (mentioning)
confidence: 99%
“…In this way, structural feature learning of graphs can be enhanced and may benefit downstream tasks. Recently, prompt-based tuning methods [42,43,44,45] in natural language processing (NLP) represent a powerful performance by bridging gaps between the pre-training stage and downstream task fine-tuning stages. For instance, given an input sentence of "The story of the movie is well arranged.…”
Section: Introduction (mentioning)
confidence: 99%
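The citation statement above describes prompt-based tuning with a cloze-style template whose masked token hints at the label, using the quoted movie-review sentence as its example. As a rough illustration of that general idea (not the GraphPrompt method itself), the sketch below wraps that sentence in a hypothetical template and maps a masked-language-model prediction back to a sentiment label; the model name, template wording, and verbalizer words ("great"/"terrible") are assumptions made for illustration only.

```python
# Minimal sketch of cloze-style prompt-based classification, in the spirit of
# the prompt-template methods cited above (e.g. Schick et al.). The template
# and label verbalizers here are illustrative assumptions, not the paper's setup.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

def classify_sentiment(sentence: str) -> str:
    # Wrap the input in a prompt template whose [MASK] token hints at the label.
    prompt = f"{sentence} It was [MASK]."
    # Restrict the masked-token scores to the two verbalizer words.
    predictions = fill_mask(prompt, targets=["great", "terrible"])
    # Map the higher-scoring verbalizer token back to a class label.
    best = max(predictions, key=lambda p: p["score"])
    return "positive" if best["token_str"] == "great" else "negative"

print(classify_sentiment("The story of the movie is well arranged."))
```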