2019
DOI: 10.1016/j.jbi.2019.103156
Enhancing metabolic event extraction performance with multitask learning concept

Cited by 8 publications (6 citation statements) | References 20 publications
“…In the formal case, where n tasks or subsets of them are related to each other but are not exactly the same, MTL can enhance the learning of a specific model by using the information contained in all n tasks, whereas conventional deep learning approaches seek to solve only one task with one particular model (Wu et al, 2020). This research inspired and laid the groundwork for much of MTL's later work by showing feasible and significant early results (Kongburan et al, 2019). Crichton et al (2017) presented three models for BioNER: a multi-output multitask model, an MTM, and an STM.…”
Section: Related Work
confidence: 97%
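The hard-parameter-sharing idea behind the excerpt above can be sketched in a few lines: one shared representation layer feeds several task-specific heads, so every task's signal flows through the same shared weights. This is a toy pure-Python illustration, not the paper's actual architecture; all names (`shared_w`, `heads`, `forward`) are hypothetical.

```python
import random

random.seed(0)
DIM_IN, DIM_SHARED, N_TASKS = 4, 3, 2

# Shared representation layer, reused by every task (the "hard sharing" part).
shared_w = [[random.uniform(-1, 1) for _ in range(DIM_IN)]
            for _ in range(DIM_SHARED)]
# One small task-specific output head per task (scalar outputs here).
heads = [[random.uniform(-1, 1) for _ in range(DIM_SHARED)]
         for _ in range(N_TASKS)]

def forward(x, task):
    # Shared hidden representation, identical regardless of task.
    hidden = [sum(w * xi for w, xi in zip(row, x)) for row in shared_w]
    # Task-specific linear head on top of the shared representation.
    return sum(w * h for w, h in zip(heads[task], hidden))

x = [1.0, 0.5, -0.5, 2.0]
outputs = [forward(x, t) for t in range(N_TASKS)]  # per-task predictions from one input
```

In training, each task's loss would backpropagate into `shared_w`, which is how information from all n tasks improves a single model.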
“…The n2c2/OHNLP Track on Clinical Semantic Textual Similarity (ClinicalSTS) dataset provides pairs of clinical text fragments, which are de-identified sentences extracted from clinical notes. The task is to assign a numerical score to each pair of sentences to express their semantic similarity.…”
Section: Datasets
confidence: 99%
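The ClinicalSTS task format described above (a pair of sentences in, a similarity score out) can be illustrated with a deliberately simple baseline: bag-of-words cosine similarity rescaled to the task's 0-5 range. Real submissions use learned sentence encoders; this sketch and the function name `sts_score` are illustrative only.

```python
from collections import Counter
import math

def sts_score(s1: str, s2: str) -> float:
    """Toy STS baseline: cosine similarity of word counts, scaled to [0, 5]."""
    a, b = Counter(s1.lower().split()), Counter(s2.lower().split())
    dot = sum(a[w] * b[w] for w in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return 5.0 * dot / norm if norm else 0.0

# Near-paraphrase pair scores close to the top of the 0-5 scale.
score = sts_score("patient denies chest pain",
                  "the patient denies any chest pain")
```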
“…Embedding words in a continuous semantic space has an important impact on many NLP tasks [4]-[6]. Mikolov et al [7] used word co-occurrence to train word vectors iteratively and proposed the Word2Vec model.…”
confidence: 99%
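The word co-occurrence that Word2Vec iterates over, as the excerpt notes, is concretely a stream of (center, context) pairs taken from a sliding window. The sketch below generates those skip-gram pairs; the embedding matrices and negative sampling that Word2Vec trains on top of them are omitted, and the function name is hypothetical.

```python
def skipgram_pairs(tokens, window=2):
    """Yield (center, context) co-occurrence pairs within a sliding window."""
    pairs = []
    for i, center in enumerate(tokens):
        # Context positions: up to `window` tokens on each side of the center.
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if j != i:
                pairs.append((center, tokens[j]))
    return pairs

pairs = skipgram_pairs("metabolic event extraction improves with context".split())
```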
“…Besides event extraction, joint classification models can also be trained with other NLP tasks, such as named entity recognition, event co-reference resolution, and event relation extraction. Some researchers have proposed to train a single model to jointly execute these tasks [128]-[132]. For example, Li et al [128] proposed a framework to jointly execute these tasks together with event extraction in a single model.…”
Section: Joint Classification Model
confidence: 99%
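The joint setup the excerpt describes can be sketched at the interface level: one pass over a sentence emits both an entity tag and an event-trigger label per token, instead of running separate NER and event-extraction models. The keyword dictionaries below are toy stand-ins for learned classifiers, and all names are hypothetical, not from the cited frameworks.

```python
# Toy lexicons standing in for the two jointly trained classifiers.
ENTITIES = {"glucose": "Metabolite", "insulin": "Protein"}
TRIGGERS = {"activates": "Positive_regulation", "inhibits": "Negative_regulation"}

def joint_tag(sentence: str):
    """One pass, two predictions per token: (token, entity_tag, trigger_tag)."""
    return [(tok, ENTITIES.get(tok, "O"), TRIGGERS.get(tok, "O"))
            for tok in sentence.lower().split()]

tags = joint_tag("insulin activates glucose uptake")
```

The point of the joint formulation is that, in a trained model, both label columns would be predicted from one shared sentence encoding, letting the tasks inform each other.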