2022
DOI: 10.48550/arxiv.2206.12241
Preprint

CoSP: Co-supervised pretraining of pocket and ligand

Abstract: Can we inject the pocket-ligand interaction knowledge into the pre-trained model and jointly learn their chemical space? Pretraining molecules and proteins has attracted considerable attention in recent years, while most of these approaches focus on learning one of the chemical spaces and lack the injection of biological knowledge. We propose a co-supervised pretraining (CoSP) framework to simultaneously learn 3D pocket and ligand representations. We use a gated geometric message passing layer to model both 3D…
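The abstract names a "gated geometric message passing layer" as the backbone for encoding 3D pockets and ligands. The paper's exact architecture is not reproduced here, so the following is only a minimal sketch of what such a layer could look like: an EGNN-style message built from node features and an invariant pairwise distance, scaled by a learned sigmoid gate before aggregation. The class name, layer widths, and gating placement are illustrative assumptions, not the CoSP authors' design.

```python
import torch
import torch.nn as nn

class GatedGeometricMP(nn.Module):
    """Hypothetical gated geometric message passing layer (illustration only).

    Messages depend on node features and squared pairwise distances, which
    are invariant to rotation and translation of the 3D coordinates; a
    sigmoid gate lets the layer down-weight uninformative edges.
    """

    def __init__(self, dim: int):
        super().__init__()
        # Message MLP takes [h_src, h_dst, distance^2] -> message vector.
        self.msg_mlp = nn.Sequential(
            nn.Linear(2 * dim + 1, dim), nn.SiLU(), nn.Linear(dim, dim)
        )
        # Scalar gate in (0, 1) per message.
        self.gate = nn.Sequential(nn.Linear(dim, 1), nn.Sigmoid())
        # Node update from [h, aggregated messages].
        self.update_mlp = nn.Sequential(
            nn.Linear(2 * dim, dim), nn.SiLU(), nn.Linear(dim, dim)
        )

    def forward(self, h, x, edge_index):
        # h: [N, dim] node features; x: [N, 3] coordinates;
        # edge_index: [2, E] (source, target) indices.
        src, dst = edge_index
        d2 = ((x[src] - x[dst]) ** 2).sum(dim=-1, keepdim=True)
        m = self.msg_mlp(torch.cat([h[src], h[dst], d2], dim=-1))
        m = self.gate(m) * m  # gated message
        agg = torch.zeros_like(h).index_add_(0, dst, m)  # sum per target node
        return h + self.update_mlp(torch.cat([h, agg], dim=-1))

# Toy usage: 5 atoms, random features/coordinates, fully connected graph.
h = torch.randn(5, 32)
x = torch.randn(5, 3)
pairs = torch.combinations(torch.arange(5), r=2).t()
edge_index = torch.cat([pairs, pairs.flip(0)], dim=1)  # both directions
h_out = GatedGeometricMP(32)(h, x, edge_index)
print(h_out.shape)  # torch.Size([5, 32])
```

Because the geometry enters only through pairwise distances, the node updates in this sketch are invariant to rigid motions of the input, a property typically wanted when jointly embedding pockets and ligands.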

Cited by 1 publication (1 citation statement)
References: 46 publications
“…More recently, some hybrid architectures (Rong et al, 2020;Ying et al, 2021;Min et al, 2022) of GNNs and transformers are emerging to capture the topological structures of molecular graphs. Additionally, given that the available labels for molecules are often expensive or incorrect (Xia et al, 2021;Tan et al, 2021;Xia et al, 2022a), the emerging self-supervised pre-training strategies (You et al, 2020;Xia et al, 2022c;Yue et al, 2022;Liu et al, 2023) on graph-structured data are promising for molecular graph data (Hu et al, 2020;Xia et al, 2023a;Gao et al, 2022), just like the overwhelming success of pre-trained language models in natural language processing community (Devlin et al, 2019;Zheng et al, 2022).…”
Section: C3 2D and 3D Graph-based Molecular Descriptors
Citation type: mentioning
Confidence: 99%