2019
DOI: 10.48550/arxiv.1910.09255
Preprint

Constructing Artificial Data for Fine-tuning for Low-Resource Biomedical Text Tagging with Applications in PICO Annotation

Abstract: Biomedical text tagging systems are plagued by the dearth of labeled training data. There have been recent attempts at using pre-trained encoders to deal with this issue. The pre-trained encoder provides a representation of the input text, which is then fed to task-specific layers for classification. The entire network is fine-tuned on the labeled data from the target task. Unfortunately, a low-resource biomedical task often has too few labeled instances for satisfactory fine-tuning. Also, if the label space is large…
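
The setup described in the abstract can be sketched as follows: a pre-trained encoder produces a representation of the input text, a task-specific layer maps it to labels, and the whole network is fine-tuned on the target task's labeled data. The model name, label count, and training details below are illustrative assumptions, not the paper's exact configuration.

```python
# Minimal sketch, assuming a HuggingFace-style BERT encoder with a
# task-specific classification head; all hyperparameters are illustrative.
import torch
from torch import nn
from transformers import AutoModel, AutoTokenizer

class TaggingModel(nn.Module):
    def __init__(self, encoder_name: str = "bert-base-uncased", num_labels: int = 3):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(encoder_name)      # pre-trained encoder
        self.classifier = nn.Linear(self.encoder.config.hidden_size,
                                    num_labels)                      # task-specific layer

    def forward(self, input_ids, attention_mask):
        # Use the [CLS] token representation as the encoding of the input text.
        outputs = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        cls_repr = outputs.last_hidden_state[:, 0]
        return self.classifier(cls_repr)

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = TaggingModel()
batch = tokenizer(["Patients received aspirin daily."],
                  return_tensors="pt", padding=True, truncation=True)
logits = model(batch["input_ids"], batch["attention_mask"])

# Fine-tuning updates all parameters (encoder and classifier) on the labeled data;
# with too few labeled instances this step is where low-resource tasks struggle.
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
```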

Cited by 1 publication (1 citation statement)
References: 15 publications

“…We broadly categorise existing approaches based on their modification method into input-related, external and internal. Input modifications (Singh et al., 2020; Lai et al., 2020; Ruan et al., 2020) adapt the information that is fed to BERT, e.g. feeding text triples separated by [SEP] tokens instead of sentence pairs as in Lai et al. (2020), while leaving the architecture unchanged.…”
Section: Our Qualitative Analysis Provides Insights Into…
Citation type: mentioning
confidence: 99%
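
The input modification quoted above can be illustrated with a short sketch: three text segments are joined by [SEP] tokens so a standard BERT encoder sees one sequence, with no change to the architecture. The segment contents and tokenizer choice are illustrative assumptions, not taken from the cited works.

```python
# Illustrative sketch (not the cited works' exact code): encoding a text
# triple for BERT by joining segments with [SEP] instead of the usual
# sentence pair; the encoder architecture itself is left unchanged.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

# Hypothetical PICO-style segments, made up for illustration.
segments = [
    "aspirin 100 mg daily",                # intervention
    "placebo",                             # comparator
    "reduction in cardiovascular events",  # outcome
]

# Build the sequence: [CLS] seg1 [SEP] seg2 [SEP] seg3 [SEP]
input_ids = [tokenizer.cls_token_id]
for seg in segments:
    input_ids += tokenizer.encode(seg, add_special_tokens=False)
    input_ids.append(tokenizer.sep_token_id)

print(tokenizer.convert_ids_to_tokens(input_ids))
```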