Published: 2024
DOI: 10.1016/j.cmpb.2023.108003
TransVAE-DTA: Transformer and variational autoencoder network for drug-target binding affinity prediction

Changjian Zhou, Zhongzheng Li, Jia Song, et al.
Cited by 11 publications (8 citation statements)
References 17 publications
“…In the future, it is imperative to delve into DTA prediction methods based on deep learning from three key perspectives: Performance analysis of multiple state-of-the-art methods based on KIBA dataset. The evaluation metric values of these methods in the figure are sourced from References (Bi et al, 2023;Xia et al, 2023;Tian et al, 2024;Wu et al, 2024;Zhou et al, 2024). Performance analysis of multiple state-of-the-art methods based on Davis dataset.…”
Section: Discussion (mentioning)
Confidence: 99%
“…Figure 1 highlights PDBbind, KIBA, and Davis datasets as commonly used datasets for predicting DTA using deep learning. We summarized the performance evaluation metrics values of several state-of-the-art methods on PDBbind, KIBA, and Davis datasets, as reported in recently published literatures ( Wang et al, 2023a ; Zhu et al, 2023a ; Bi et al, 2023 ; Xia et al, 2023 ; Tian et al, 2024 ; Wu et al, 2024 ; Zhou et al, 2024 ), without considering the specific partitioning of the corresponding datasets by these methods. Although the statistical results ( Tables 4 , 5 ; Figures 5 – 7 ) showed that these methods have achieved good prediction performance for DTA on commonly used benchmark datasets, the further improvement in DTA prediction still faces challenges.…”
Section: Performance Analysis Of Multiple State-of-the-art Methods Ba... (mentioning)
Confidence: 99%
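The excerpt above refers to evaluation metric values reported on the PDBbind, KIBA, and Davis benchmarks. For context, the two metrics most commonly reported for DTA regression on these datasets are mean squared error (MSE) and the concordance index (CI). The sketch below is my own minimal illustration of those two metrics; the function names and the tie-handling convention are assumptions, not code from any of the cited works.

```python
# Minimal illustrative sketch of the two DTA metrics commonly reported on
# Davis/KIBA-style benchmarks: MSE and concordance index (CI).
import numpy as np

def mse(y_true, y_pred):
    """Mean squared error between measured and predicted affinities."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return float(np.mean((y_true - y_pred) ** 2))

def concordance_index(y_true, y_pred):
    """Fraction of comparable pairs whose predicted ordering matches the true
    ordering; tied predictions count as 0.5 (a common convention, assumed here)."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    num, den = 0.0, 0.0
    for i in range(len(y_true)):
        for j in range(i + 1, len(y_true)):
            if y_true[i] == y_true[j]:
                continue  # pair with equal true affinities is not comparable
            den += 1.0
            # positive product means the pair is ranked the same way as the ground truth
            diff = (y_pred[i] - y_pred[j]) * (y_true[i] - y_true[j])
            num += 1.0 if diff > 0 else (0.5 if diff == 0 else 0.0)
    return num / den if den else 0.0

if __name__ == "__main__":
    y, yhat = [5.0, 7.2, 6.1, 8.3], [5.3, 7.0, 6.4, 8.0]
    print(f"MSE = {mse(y, yhat):.3f}, CI = {concordance_index(y, yhat):.3f}")
```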
“…Evaluated interpretability of downstream DTA predictions are obtained from the attention scores. To better capture structural information, which is essential for DTA, several recent methods such as GTAMP-DTA, 117 AttentionMGT-DTA, 118 and TransVAE-DTA 119 turn to graph representations, representing molecules and proteins as molecular graph and protein pocket graphs and processing them via graph transformers. These models obtain strong performances on DTA predictions.…”
Section: Applications Of Transformers In Cheminformatics (mentioning)
Confidence: 99%
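The excerpt above describes representing molecules and protein pockets as graphs and processing them with graph transformers before predicting affinity. The sketch below illustrates that general idea only; the layer sizes, adjacency-masked attention, mean pooling, and concatenation head are my own assumptions for illustration and do not reproduce the published GTAMP-DTA, AttentionMGT-DTA, or TransVAE-DTA architectures.

```python
# Simplified sketch of a graph-transformer-style DTA model (illustrative only).
import torch
import torch.nn as nn

class GraphTransformerBlock(nn.Module):
    """Self-attention over node features, restricted to graph edges via an attention mask."""
    def __init__(self, dim, heads=4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.norm1 = nn.LayerNorm(dim)
        self.ff = nn.Sequential(nn.Linear(dim, 2 * dim), nn.ReLU(), nn.Linear(2 * dim, dim))
        self.norm2 = nn.LayerNorm(dim)

    def forward(self, x, adj):
        # allow attention only along edges (plus self-loops)
        mask = (adj + torch.eye(adj.size(-1), device=adj.device)) == 0
        h, _ = self.attn(x, x, x, attn_mask=mask)
        x = self.norm1(x + h)
        return self.norm2(x + self.ff(x))

class GraphDTA(nn.Module):
    """Encode a molecular graph and a protein-pocket graph, fuse them, regress affinity."""
    def __init__(self, drug_feats, pocket_feats, dim=64):
        super().__init__()
        self.drug_in, self.pocket_in = nn.Linear(drug_feats, dim), nn.Linear(pocket_feats, dim)
        self.drug_enc, self.pocket_enc = GraphTransformerBlock(dim), GraphTransformerBlock(dim)
        self.head = nn.Sequential(nn.Linear(2 * dim, dim), nn.ReLU(), nn.Linear(dim, 1))

    def forward(self, xd, ad, xp, ap):
        hd = self.drug_enc(self.drug_in(xd), ad).mean(dim=1)      # mean-pool atom nodes
        hp = self.pocket_enc(self.pocket_in(xp), ap).mean(dim=1)  # mean-pool residue nodes
        return self.head(torch.cat([hd, hp], dim=-1)).squeeze(-1)

if __name__ == "__main__":
    # toy batch of 2 complexes sharing one random adjacency per branch
    xd, ad = torch.randn(2, 30, 16), torch.randint(0, 2, (30, 30)).float()
    xp, ap = torch.randn(2, 50, 21), torch.randint(0, 2, (50, 50)).float()
    print(GraphDTA(16, 21)(xd, ad, xp, ap).shape)  # -> torch.Size([2])
```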