2021
DOI: 10.1093/bioinformatics/btab715
HyperAttentionDTI: improving drug–protein interaction prediction by sequence-based deep learning with attention mechanism

Abstract: Motivation Identifying drug–target interactions (DTIs) is a crucial step in drug repurposing and drug discovery. Accurately identifying DTIs in silico can significantly shorten development time and reduce costs. Recently, many sequence-based methods have been proposed for DTI prediction and improve performance by introducing the attention mechanism. However, these methods only model single non-covalent inter-molecular interactions among drugs and proteins and ignore the complex interaction between …

Cited by 108 publications (63 citation statements). References 26 publications.
“…The attention mechanisms allow the network to focus on the most relevant parts of the input and have been proven to be useful for various tasks. 4,36 To capture significant information from both the SMILES sequence and molecular graph, we designed an attention mechanism, called Graph–Sequence Attention. Specifically, given the SMILES representation S_i ∈ ℝ^N and the graph representation H_i ∈ ℝ^N, we transform S_i and H_i into the vectors d_{S_i} and d_{H_i} through formula (2) for feature extraction and attention modeling: where W_1 ∈ ℝ^N and W_2 ∈ ℝ^N are trainable parameters, and b represents the bias vector.…”
Section: Methods
confidence: 99%
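The quoted passage describes the projection step but omits formula (2) itself, so the following is only a minimal NumPy sketch of such a two-view attention fusion. The tanh projection, the scoring rule, and the softmax over the two views are assumptions for illustration, not the cited authors' actual formula:

```python
import numpy as np

def graph_sequence_attention(S, H, W1, W2, b):
    """Hypothetical sketch of a Graph-Sequence Attention step: project the
    SMILES representation S and the graph representation H (both length-N
    vectors) with trainable weights W1, W2 and bias b, then fuse the two
    views with softmax attention weights."""
    d_S = np.tanh(W1 * S + b)  # projected sequence features (assumed form)
    d_H = np.tanh(W2 * H + b)  # projected graph features (assumed form)
    scores = np.stack([d_S.sum(), d_H.sum()])       # one scalar score per view
    alpha = np.exp(scores) / np.exp(scores).sum()   # softmax over the two views
    return alpha[0] * S + alpha[1] * H              # attention-weighted fusion

rng = np.random.default_rng(0)
N = 8
S, H = rng.normal(size=N), rng.normal(size=N)
W1, W2, b = rng.normal(size=N), rng.normal(size=N), rng.normal(size=N)
fused = graph_sequence_attention(S, H, W1, W2, b)
print(fused.shape)  # (8,)
```

The point of the sketch is the structure, not the exact scoring rule: two fixed-size views of the same molecule are scored, normalized against each other, and combined into one vector of the same dimension.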
“…1,2 The effective prediction of drug–target binding affinity (DTA) is one of the significant issues in drug discovery. [3][4][5] Drugs are usually represented as a string obtained from the simplified molecular-input line-entry system (SMILES) 6 or represented by a molecule graph with atoms as nodes and chemical bonds as edges. Targets (or proteins) are sequences of amino acids.…”
Section: Introduction
confidence: 99%
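The representation scheme described above can be made concrete with a small encoding sketch. The vocabularies and the fixed lengths below are illustrative assumptions (real models typically learn embeddings over a full SMILES/amino-acid vocabulary), but the input format — integer ID arrays padded to a fixed length — is the standard one for sequence-based DTA/DTI models:

```python
# Assumed (partial) vocabularies, for illustration only.
SMILES_VOCAB = {ch: i + 1 for i, ch in enumerate("#()+-=123456789BCFHINOPS[]clnos")}
AMINO_VOCAB = {aa: i + 1 for i, aa in enumerate("ACDEFGHIKLMNPQRSTVWY")}

def encode(seq, vocab, max_len):
    """Map each character to an integer ID, truncate, and right-pad with 0
    (0 doubles as the unknown/padding token)."""
    ids = [vocab.get(c, 0) for c in seq[:max_len]]
    return ids + [0] * (max_len - len(ids))

drug = encode("CC(=O)Oc1ccccc1C(=O)O", SMILES_VOCAB, 32)   # aspirin SMILES
target = encode("MKTAYIAKQR", AMINO_VOCAB, 16)             # toy protein prefix
print(len(drug), len(target))  # 32 16
```

Graph-based pipelines instead parse the same SMILES string into an atom/bond graph (e.g., with a cheminformatics toolkit) rather than tokenizing it as text.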
“…Up-sampling Drugbank Dataset: While the associations between the drugs and targets constitute the positive set, there is no negative set. To address the negative set issue, we pair random drug–target associations as the negative set, similar to previous works [3], [7], [17]. This leads to a highly imbalanced dataset depending on the number of random pairings being considered for each drug or target.…”
Section: Loss Functions
confidence: 99%
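The random-pairing construction described above can be sketched as follows. This is an assumption about the general technique, not the cited authors' exact code; the rejection of known positives is the one detail any such sampler needs:

```python
import random

def sample_negatives(drugs, targets, positives, n, seed=0):
    """Draw n random drug-target pairs that do not appear in the known
    positive interaction set, to serve as presumed negatives."""
    rng = random.Random(seed)
    positives = set(positives)
    negatives = set()
    while len(negatives) < n:
        pair = (rng.choice(drugs), rng.choice(targets))
        if pair not in positives:   # reject known interactions
            negatives.add(pair)
    return sorted(negatives)

drugs = ["d1", "d2", "d3"]
targets = ["t1", "t2", "t3", "t4"]
positives = [("d1", "t1"), ("d2", "t3")]
negs = sample_negatives(drugs, targets, positives, n=5)
print(len(negs))  # 5
```

The imbalance the quote mentions follows directly from the knob `n`: sampling many negatives per positive skews the class ratio, which is why such pipelines typically pair this step with a reweighted loss or controlled sampling ratio.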
“…For proteins represented as FASTA sequences, we use a convolutional neural network for learning representations. Therefore, unlike previous works [2], [3], [7], [13], [17], [18], GraMDTA learns representations for drugs and proteins from structures and their corresponding knowledge graphs. We aggregate the multiple modalities of drugs and proteins using a multi-head attention weighting mechanism to learn relevant information while eliminating the noisy information cascades.…”
Section: Introduction
confidence: 99%
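The multi-head attention weighting over modalities can be illustrated with a small NumPy sketch. The shapes, the query/key scoring, and the head-averaging below are assumptions for illustration, not GraMDTA's actual architecture:

```python
import numpy as np

def multihead_modality_fusion(modalities, Wq, Wk, heads=2):
    """Sketch of multi-head attention weighting over M modality embeddings:
    each head scores every modality, softmax-normalizes the scores, and the
    fused vector is the head-averaged attention-weighted sum."""
    X = np.stack(modalities)                  # (M, D) modality embeddings
    M, D = X.shape
    fused = np.zeros(D)
    for h in range(heads):
        q = X @ Wq[h]                         # (M,) per-modality query scores
        k = X @ Wk[h]                         # (M,) per-modality key scores
        scores = q * k / np.sqrt(D)
        alpha = np.exp(scores - scores.max())
        alpha /= alpha.sum()                  # softmax over modalities
        fused += alpha @ X                    # weighted sum of modality vectors
    return fused / heads

rng = np.random.default_rng(1)
D = 6
mods = [rng.normal(size=D) for _ in range(3)]  # e.g., structure, KG, sequence views
Wq = rng.normal(size=(2, D))
Wk = rng.normal(size=(2, D))
out = multihead_modality_fusion(mods, Wq, Wk)
print(out.shape)  # (6,)
```

The design intent the quote describes is visible here: low-scoring (noisy) modalities receive small attention weights, so they contribute little to the fused representation.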
“…Many deep learning models have been proposed to use protein sequences as input [19][20][21][22][23][24][25][26][27][28] , but none have thoroughly verified the concept of the sequence-to-drug paradigm. In this work, we address the issue for the first time through three stages (Fig.…”
Section: Introduction
confidence: 99%