2018
DOI: 10.1186/s12859-018-2275-2
Biomedical event extraction based on GRU integrating attention mechanism

Abstract: Background: Biomedical event extraction is a crucial task in biomedical text mining. As the primary forum for international evaluation of biomedical event extraction technologies, the BioNLP Shared Task represents a trend in biomedical text mining toward fine-grained information extraction (IE). The fourth series of the BioNLP Shared Task in 2016 (BioNLP-ST’16) proposed three tasks, among which the Bacteria Biotope event extraction (BB) task had already been put forward in earlier editions of the BioNLP-ST. Deep learning methods pro…

Cited by 34 publications (58 citation statements)
References 12 publications
“…While the dependency paths only connect a single word to others within the same sentence (intra-sentence), there are some cross-sentence (inter-sentence) associations that can be very challenging in terms of the extraction task. In order to compare with other existing works [5,15,16,17,18], only intra-sentence relations were considered.…”
Section: Text Preprocessing
confidence: 99%
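The intra-sentence restriction described above amounts to keeping only entity pairs whose mentions fall inside the same sentence span. A minimal sketch of that filter, assuming character-offset sentence spans from any sentence splitter (the function and argument names are illustrative, not the authors' code):

```python
def is_intra_sentence(sent_spans, e1_offset, e2_offset):
    """Return True if both entity offsets fall inside the same sentence.

    sent_spans: list of (start, end) character offsets, one per sentence
    (hypothetical output of a sentence splitter).
    e1_offset, e2_offset: character offsets of the two entity mentions.
    """
    for start, end in sent_spans:
        # both mentions must land inside one sentence's span
        if start <= e1_offset < end and start <= e2_offset < end:
            return True
    # mentions in different sentences -> an inter-sentence pair, dropped
    return False
```

Pairs for which this returns False would be the cross-sentence associations that the cited works exclude for comparability.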
“…DET-BLSTM [17] applied a bidirectional LSTM (BLSTM) to a dynamic extended tree (DET) adapted from SDPs and achieved an F1 score of 57.14%. Recently, BGRU-Attn [18] proposed a bidirectional gated recurrent unit (BGRU) with an attention mechanism and domain-oriented distributed word representations. It consequently became the state-of-the-art DL system without hand-designed features for the BB task, with an F1 score of 57.42%.…”
confidence: 99%
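The attention pooling that BGRU-Attn applies over the per-token BGRU outputs can be sketched in plain Python. This is a simplified dot-product scorer standing in for the paper's learned attention parameters; `attention_pool` and its arguments are hypothetical names, not the authors' code:

```python
import math

def attention_pool(hidden_states, weights):
    """Attention pooling over recurrent hidden states.

    hidden_states: list of T vectors (lists of floats), e.g. the
    per-token outputs of a bidirectional GRU.
    weights: a vector of the same dimension used to score each state
    (a stand-in for learned attention parameters).
    Returns one context vector: the softmax-weighted sum of the states.
    """
    # score each hidden state with a dot product (simplified scorer)
    scores = [sum(w * h for w, h in zip(weights, state))
              for state in hidden_states]
    # softmax over token positions (subtract max for numerical stability)
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    alphas = [e / z for e in exps]
    # weighted sum of hidden states -> a single sequence representation
    dim = len(hidden_states[0])
    return [sum(a * state[i] for a, state in zip(alphas, hidden_states))
            for i in range(dim)]
```

The resulting context vector would then feed a classifier; in the cited system the scoring function and word representations are learned, which this fixed-weight sketch does not attempt to reproduce.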