2017
DOI: 10.1093/bioinformatics/btx659

Drug–drug interaction extraction via hierarchical RNNs on sequence and shortest dependency paths

Abstract: Motivation: Adverse events resulting from drug–drug interactions (DDIs) pose a serious health issue. The ability to automatically extract DDIs described in the biomedical literature could further efforts for ongoing pharmacovigilance. Most neural network-based methods focus on the sentence sequence to identify these DDIs; however, the shortest dependency path (SDP) between the two entities contains valuable syntactic and semantic information. Effectively exploiting such information may improve DDI extrac…
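The abstract describes combining a sentence-sequence encoder with an encoder over the shortest dependency path. As a rough illustration only (not the authors' released code), the sketch below shows one way such a combination could look; PyTorch, the BiLSTM choice, the dimensions, and all names are assumptions.

```python
# Illustrative sketch: encode the full sentence and the SDP token sequence with
# separate BiLSTMs, then concatenate both representations for DDI classification.
# NOT the authors' implementation; dimensions and names are assumptions.
import torch
import torch.nn as nn

class SentenceSdpClassifier(nn.Module):
    def __init__(self, vocab_size: int, emb_dim: int = 100,
                 hidden_dim: int = 128, num_classes: int = 5):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # One BiLSTM over the full sentence, one over the SDP token sequence.
        self.sent_rnn = nn.LSTM(emb_dim, hidden_dim, batch_first=True, bidirectional=True)
        self.sdp_rnn = nn.LSTM(emb_dim, hidden_dim, batch_first=True, bidirectional=True)
        self.classifier = nn.Linear(4 * hidden_dim, num_classes)

    def forward(self, sent_ids: torch.Tensor, sdp_ids: torch.Tensor) -> torch.Tensor:
        _, (h_sent, _) = self.sent_rnn(self.embed(sent_ids))
        _, (h_sdp, _) = self.sdp_rnn(self.embed(sdp_ids))
        # Concatenate the final forward/backward states of both encoders.
        sent_vec = torch.cat([h_sent[0], h_sent[1]], dim=-1)
        sdp_vec = torch.cat([h_sdp[0], h_sdp[1]], dim=-1)
        return self.classifier(torch.cat([sent_vec, sdp_vec], dim=-1))
```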


Cited by 145 publications (107 citation statements)
References 26 publications (41 reference statements)
“…Afterward, we concatenated two sentence embeddings and fed them into an architecture with one dense layer to predict the similarity of two sentences. ; BC5CDR-disease, BC5CDR-chem (Yoon et al, 2018); ShARe/CLEFE (Leaman et al, 2015); DDI (Zhang et al, 2018). Chem-Prot (Peng et al, 2018); i2b2 (Rink et al, 2011); HoC (Du et al, 2019); MedNLI (Romanov and Shivade, 2018).…”
Section: Fine-Tuning with ELMo
confidence: 99%
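The excerpt above describes concatenating two sentence embeddings and scoring the pair with a single dense layer. A minimal sketch of that setup, assuming PyTorch and illustrative dimensions (not the cited work's exact code):

```python
# Sentence-pair similarity head: two precomputed sentence embeddings are
# concatenated and scored by one dense layer. Names and sizes are illustrative.
import torch
import torch.nn as nn

class PairSimilarityHead(nn.Module):
    def __init__(self, emb_dim: int = 768):
        super().__init__()
        # Single dense layer over the concatenated pair, producing one score.
        self.dense = nn.Linear(2 * emb_dim, 1)

    def forward(self, emb_a: torch.Tensor, emb_b: torch.Tensor) -> torch.Tensor:
        pair = torch.cat([emb_a, emb_b], dim=-1)             # (batch, 2 * emb_dim)
        return torch.sigmoid(self.dense(pair)).squeeze(-1)   # similarity in [0, 1]

# Usage with random tensors standing in for real sentence encoders.
head = PairSimilarityHead(emb_dim=768)
a, b = torch.randn(4, 768), torch.randn(4, 768)
print(head(a, b).shape)  # torch.Size([4])
```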
“…We ran the model 5 times with different random seeds and then calculated the average performance [62]. The state-of-the-art (SOTA) model by Zhang and colleagues achieved an F1-score of 0.73 on this dataset [63]. Their model uses an LSTM as an encoder with an attention mechanism and outperforms other feature-based, kernel-based, and neural networks-based methods.…”
Section: Extrinsic Evaluation Results
confidence: 99%
“…Attention mechanisms have recently been successfully applied to biomedical relation extraction tasks [14,18,30]. These attention networks are able to learn a vector of important weights for each word in a sentence to reflect its impact on the final result.…”
Section: Attention Mechanisms
confidence: 99%
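The excerpt describes attention networks that learn an importance weight for each word in a sentence. A short sketch of that idea under assumed PyTorch conventions; the learned context-vector scoring shown here is one common formulation, not necessarily the exact layer used in the cited papers:

```python
# Word-level attention over RNN hidden states: a learned context vector scores
# each token, softmax turns scores into per-word weights, and the sentence
# vector is their weighted sum. Names and dimensions are hypothetical.
import torch
import torch.nn as nn

class WordAttention(nn.Module):
    def __init__(self, hidden_dim: int):
        super().__init__()
        self.context = nn.Parameter(torch.randn(hidden_dim))

    def forward(self, hidden: torch.Tensor):
        # hidden: (batch, seq_len, hidden_dim)
        scores = hidden @ self.context                           # (batch, seq_len)
        weights = torch.softmax(scores, dim=-1)                  # per-word importance
        sentence = (weights.unsqueeze(-1) * hidden).sum(dim=1)   # (batch, hidden_dim)
        return sentence, weights
```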
“…Unlike the feature-based models, DL models demand less feature engineering because they can automatically learn useful features from training data. Examples of popular DL models that have successfully been applied for biomedical relation extraction include Convolutional Neural Networks (CNNs) [9,10,11,12] and Recurrent Neural Networks (RNNs) [13,14].…”
confidence: 99%
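The excerpt mentions CNNs and RNNs as popular deep-learning models for biomedical relation extraction. For concreteness, a minimal CNN-style relation classifier is sketched below, assuming PyTorch; the filter settings, class count, and names are illustrative rather than taken from the cited works:

```python
# Minimal CNN relation classifier: convolution filters over word embeddings
# followed by max-over-time pooling and a linear classifier. Assumed PyTorch;
# hyperparameters and names are illustrative only.
import torch
import torch.nn as nn

class CnnRelationClassifier(nn.Module):
    def __init__(self, vocab_size: int, emb_dim: int = 100,
                 num_filters: int = 64, kernel_size: int = 3, num_classes: int = 5):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.conv = nn.Conv1d(emb_dim, num_filters, kernel_size, padding=1)
        self.classifier = nn.Linear(num_filters, num_classes)

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        x = self.embed(token_ids).transpose(1, 2)   # (batch, emb_dim, seq_len)
        x = torch.relu(self.conv(x))                # (batch, num_filters, seq_len)
        x, _ = x.max(dim=-1)                        # max-over-time pooling
        return self.classifier(x)                   # (batch, num_classes)
```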