2022
DOI: 10.1186/s12859-022-05062-6

Long-distance dependency combined multi-hop graph neural networks for protein–protein interactions prediction

Abstract: Background Protein–protein interactions are widespread in biological systems and play an important role in cell biology. Because traditional laboratory-based methods have drawbacks such as being time-consuming and costly, a large number of deep learning-based methods have emerged. However, these methods do not take into account the long-distance dependency information between pairs of amino acids in a sequence. In addition, most existing models based on graph neural networks only agg…


Cited by 8 publications (10 citation statements)
References 41 publications
“…Graph representation learning (GRL) has numerous applications in drug discovery [Xiong et al, 2019], knowledge graph completion, and recommender systems [Ying et al, 2018]. The key downstream tasks in GRL are node classification, link prediction [Zhang and ], and graph classification.…”
Section: Related Work
Confidence: 99%
“…Recently, a new research direction has emerged in GNNs on scalable training and inference techniques, which mainly focuses on node classification tasks. A common practice is to use different sampling techniques, such as node sampling, layer-wise sampling [Zou et al, 2019], and graph sampling [Zeng et al, 2020], to subsample the training data and reduce computation. Other approaches use simplification techniques, such as removing intermediate non-linearities [Wu et al, 2019], to speed up training and inference.…”
Section: Related Work
Confidence: 99%
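The neighbor-sampling idea mentioned in the excerpt above can be sketched as follows. This is a minimal illustration of subsampling a node's neighborhood to bound per-layer computation, not the exact procedure of any cited method; the `sample_neighbors` function and the toy graph are hypothetical:

```python
import random

def sample_neighbors(adj, nodes, fanout, seed=0):
    """For each seed node, keep at most `fanout` randomly chosen
    neighbors instead of the full neighborhood, which bounds the
    cost of one message-passing layer regardless of node degree."""
    rng = random.Random(seed)
    sampled = {}
    for v in nodes:
        nbrs = adj.get(v, [])
        if len(nbrs) <= fanout:
            sampled[v] = list(nbrs)          # small neighborhood: keep all
        else:
            sampled[v] = rng.sample(nbrs, fanout)  # large: subsample
    return sampled

# Toy graph as an adjacency dict: node 0 has four neighbors,
# so a fanout of 2 caps its sampled neighborhood at two nodes.
adj = {0: [1, 2, 3, 4], 1: [0], 2: [0]}
mini_batch = sample_neighbors(adj, nodes=[0, 1], fanout=2)
```

Layer-wise and graph sampling differ in what they subsample (whole layers or subgraphs rather than per-node neighborhoods), but they share this goal of decoupling training cost from full-graph size.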