2019
DOI: 10.3390/info10080248

Transfer Learning for Named Entity Recognition in Financial and Biomedical Documents

Abstract: Recent deep learning approaches have shown promising results for named entity recognition (NER). A reasonable assumption for training robust deep learning models is that a sufficient amount of high-quality annotated training data is available. However, in many real-world scenarios, labeled training data is scarcely present. In this paper we consider two use cases: generic entity extraction from financial and from biomedical documents. First, we have developed a character based model for NER in financial docume…

Cited by 33 publications (20 citation statements) | References 14 publications
“…CNN YOLO is also used for object recognition and feature extraction in complex documents such as comics [111]. • BERT: A few studies [83], [13] reported the use of BERT for extracting features from the data. It is suitable for various types of language-processing tasks in NLP.…”
Section: Named Entity Recognition (NER)
confidence: 99%
“…Though BERT is a powerful NLP model, using it for NER without fine-tuning it on the NER dataset will not give better results. Fine-tuning BERT for NER performance improvement in financial and biomedical documents, combining BERT with word embeddings, is discussed in [83]. Fine-tuning the BERT model for document-level relation extraction using DocRED, a large-scale open-domain document-level relation extraction dataset, shows an improvement in F1 measure in [13].…”
Section: Named Entity Recognition (NER)
confidence: 99%
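The excerpts above all concern token-level NER labeling. Whatever model produces the tags (fine-tuned BERT, a character-based model, etc.), a common final step is decoding the per-token BIO tag sequence into entity spans. A minimal sketch of that decoding step (the helper name and the example tag set are illustrative, not taken from the cited papers):

```python
def bio_to_spans(tags):
    """Decode a token-level BIO tag sequence into (entity_type, start, end) spans.

    Tags look like "B-ORG", "I-ORG", "O"; `end` is exclusive.
    An "I-" tag that does not continue the current entity closes it.
    """
    spans = []
    start, etype = None, None
    for i, tag in enumerate(tags):
        if tag.startswith("B-"):
            if start is not None:          # close any open span first
                spans.append((etype, start, i))
            start, etype = i, tag[2:]      # open a new span
        elif tag.startswith("I-") and etype == tag[2:]:
            continue                       # extend the current span
        else:                              # "O" or an inconsistent "I-" tag
            if start is not None:
                spans.append((etype, start, i))
            start, etype = None, None
    if start is not None:                  # span running to the end of sequence
        spans.append((etype, start, len(tags)))
    return spans

# Example: tokens "Acme Corp acquired aspirin rights"
print(bio_to_spans(["B-ORG", "I-ORG", "O", "B-DRUG", "O"]))
# → [('ORG', 0, 2), ('DRUG', 3, 4)]
```

The same decoder works unchanged across domains (financial, biomedical), which is one reason BIO-style tagging pairs naturally with the transfer-learning setups discussed in these citations.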
“…The benefits of TL have previously been investigated for the purposes of biomedical NER (Sun and Yang, 2019; Francis et al., 2019) and RE (Peng et al., 2019; Hafiane et al., 2020). Recent work has aimed at solving the challenges of imbalanced relation distribution, linguistic variation, and partial transfer using relation-gated adversarial learning, and at capturing dependency-tree information using TreeLSTM models (Legrand et al., 2018).…”
Section: Related Work
confidence: 99%
“…Transfer learning has also seen rapid development in named entity recognition [28,29]. In cross-language NER tasks [30] and cross-domain NER tasks [31], transfer-learning-based methods have proven effective.…”
Section: Related Work
confidence: 99%