2021
DOI: 10.1093/bib/bbab317

Mol2Context-vec: learning molecular representation from context awareness for drug discovery

Abstract: With the rapid development of proteomics and the rapid increase in the number of molecular targets for drug action, computer-aided drug design (CADD) has become a fundamental task in drug discovery. One of the key challenges in CADD is molecular representation: high-quality molecular representations that capture chemical intuition help advance many frontier problems in drug discovery. At present, molecular representation still faces several pressing problems, such as the polysemy of substructures and unsmooth information flow between atomic …

Cited by 27 publications (16 citation statements)
References 52 publications
“…To reduce the computational cost, several studies adopted multitask learning to predict multiple QMPs simultaneously. 16,29,30 For example, MoleculeNet 16 compared several machine learning methods trained in the multitask setting for QMPs prediction and found that graph-based methods outperform traditional machine learning methods relying on hand-crafted features. Attentive FP 29 introduced a novel GNN that uses graph attention mechanisms at both the atom and molecule levels to learn both local and nonlocal properties of a given chemical structure.…”
Section: Related Work
confidence: 99%
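The atom-level graph attention the excerpt attributes to Attentive FP can be illustrated with a minimal numpy sketch: score each atom of a molecule, softmax-normalize the scores, and pool the atom features into one molecule-level embedding. All tensor shapes, weights, and names here are illustrative assumptions, not the model's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy molecule: 4 atoms, each described by an 8-dimensional feature vector.
atom_feats = rng.normal(size=(4, 8))

# Illustrative "learnable" parameters, randomly initialized for the sketch.
W = rng.normal(size=(8, 8))   # projection applied to every atom
a = rng.normal(size=(8,))     # attention scoring vector

# Atom-level attention: one scalar score per atom, softmax over atoms,
# then a weighted sum that yields a single molecule-level embedding.
scores = np.tanh(atom_feats @ W) @ a
weights = np.exp(scores) / np.exp(scores).sum()
mol_embedding = weights @ atom_feats
```

The same scoring-and-pooling pattern, applied once more over per-molecule states, gives the molecule-level attention the excerpt mentions.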
“…We compared the proposed method with the existing state-of-the-art (SOTA) multitask learning models ECFP, 16 CM, 16 GC, 16 DTNN, 50 MPNN, 12 AttentiveFP, 29 and Mol2Context. 30 Note that we excluded methods 14,15 that were implemented in single-task settings in this group of experiments. Alternatively, we tailored the SOTA single-task learning model TrimNet 15 to a multitask learning model by modifying the single output unit into multiple units.…”
Section: Compare With a State-of-the-art Multitask Learning Model
confidence: 99%
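The modification the excerpt describes, turning a single-task model into a multitask one by widening the output layer from one unit to several, amounts to changing only the last weight matrix while the encoder underneath stays shared. A minimal numpy sketch, with all dimensions and names chosen for illustration rather than taken from TrimNet:

```python
import numpy as np

rng = np.random.default_rng(1)

# Shared encoder output for a batch of 5 molecules (16-dim each).
hidden = rng.normal(size=(5, 16))

# Single-task head: one output unit, one property per molecule.
w_single = rng.normal(size=(16, 1))
y_single = hidden @ w_single          # shape (5, 1)

# Multitask head: widen the same layer to K units, one per property,
# so all K properties are predicted jointly from the shared encoder.
K = 12                                # illustrative number of properties
w_multi = rng.normal(size=(16, K))
y_multi = hidden @ w_multi            # shape (5, 12)
```

Sharing the encoder this way is what yields the computational savings over training K separate single-task models.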
“…Recurrent Neural Networks (RNNs) are useful for learning relational data and capturing sequential/temporal information because the output from the previous state is fed into the current state. Similar to CNNs, RNNs have also been widely used to process sequentially formalized molecular representations (e.g., SMILES) [106], [127]. Compared to CNNs (see Section 3.1), RNNs and their variants (e.g., LSTMs or GRUs) can capture long-range relationships among chemical elements in SMILES thanks to their innate recurrent memory mechanism.…”
Section: Deep Learning Technologies: How Well Can We Accomplish the T…
confidence: 99%
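The recurrence the excerpt describes, where the previous state feeds into the current one, can be sketched for a SMILES string with a bare-bones numpy RNN cell. Vocabulary, dimensions, and weight names are all illustrative; real models use learned embeddings and gated cells (LSTMs/GRUs) rather than a plain tanh update.

```python
import numpy as np

rng = np.random.default_rng(2)

smiles = "CCO"                                    # ethanol, for illustration
vocab = {ch: i for i, ch in enumerate(sorted(set(smiles)))}

dim = 8
W_x = rng.normal(size=(len(vocab), dim)) * 0.1    # input projection
W_h = rng.normal(size=(dim, dim)) * 0.1           # recurrent weights

h = np.zeros(dim)
for ch in smiles:
    x = np.zeros(len(vocab))
    x[vocab[ch]] = 1.0                            # one-hot encode the character
    h = np.tanh(x @ W_x + h @ W_h)                # previous state feeds current state
```

After the loop, `h` summarizes the whole character sequence; the recurrent term `h @ W_h` is what lets information from early atoms influence the state at later positions.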