2022
DOI: 10.1021/acsomega.1c06389

Describe Molecules by a Heterogeneous Graph Neural Network with Transformer-like Attention for Supervised Property Predictions

Abstract: Machine learning and deep learning have facilitated various successful studies of molecular property prediction. The rapid development of natural language processing and graph neural networks (GNNs) has further pushed the state-of-the-art performance of molecular property prediction to a new level. A geometric graph can describe a molecular structure with atoms as the nodes and bonds as the edges. Therefore, a graph neural network may be trained to better represent a molecular structure. The existing GNNs assumed…

Cited by 15 publications (3 citation statements)
References 32 publications
“…The molecular features in this work consist of two parts: molecular fingerprints and molecular graph representations. Molecular fingerprints typically provide information about various properties of molecules, such as general physical properties, electrochemical properties, and electron cloud characteristics [27][28][29][30] . In this work, MACCS fingerprints are utilized to encode molecular information.…”
Section: Molecular Feature Selection
Mentioning confidence: 99%
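The citation statement above describes encoding molecules as MACCS fingerprints, i.e., fixed-length bit vectors of predefined substructure keys. A minimal sketch of this step is shown below using RDKit; note that the toolkit choice and the example SMILES are assumptions of this sketch, not details stated in the cited work.

```python
# Sketch: encoding a molecule as a MACCS fingerprint with RDKit.
# RDKit is an assumed toolkit here; the cited work does not specify one.
from rdkit import Chem
from rdkit.Chem import MACCSkeys

# Ethanol as a small example molecule (hypothetical choice for illustration)
mol = Chem.MolFromSmiles("CCO")

# GenMACCSKeys returns a 167-bit vector (bit 0 is unused by convention)
fp = MACCSkeys.GenMACCSKeys(mol)
bits = [int(fp[i]) for i in range(fp.GetNumBits())]  # 0/1 feature vector

print(fp.GetNumBits())  # 167
```

The resulting bit vector can be concatenated with learned graph representations, which is the two-part feature design the citing paper describes.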
“…The infusion of Transformers into molecular property prediction research signals the dawn of a new era, signifying a substantial leap forward. Models such as ABT-MPNN and TranGRU have demonstrated the transformative potential of Transformers in enhancing the understanding of molecular information [146–148]. ABT-MPNN, by seamlessly integrating the self-attention mechanism with MPNNs, refines molecular representation embedding, achieving competitive or superior performance across various datasets in quantitative structure–property relationship tasks.…”
Section: Applications of Attention-Based Models in Drug Discovery
Mentioning confidence: 99%
“…By harnessing and fully utilizing this wealth of data, we can undoubtedly accelerate the development process of OSC materials. In recent years, the utilization of graph neural network (GNN) models has gained significant traction in the field of materials science. These models have been applied in various aspects, including the accurate prediction of material properties with minimal computational resources. By leveraging the power of GNNs, researchers can efficiently explore and analyze structure–property relationships, shedding light on the underlying mechanisms that govern material behavior. Additionally, GNNs have been instrumental in the development of new materials that are specifically tailored to exhibit desired properties, opening up new possibilities for designing materials with targeted functionalities. Indeed, the crystal graph convolutional neural network (CGCNN) is a specialized GNN designed specifically for processing and predicting properties of crystalline materials. The CGCNN employs a residual structure to effectively propagate information across multiple channels, thereby addressing the issue of gradient vanishing.…”
Section: Introduction
Mentioning confidence: 99%