2023
DOI: 10.1021/jacs.2c11420
MOFormer: Self-Supervised Transformer Model for Metal–Organic Framework Property Prediction

Abstract: Metal–organic frameworks (MOFs) are materials with a high degree of porosity that can be used for many applications. However, the chemical space of MOFs is enormous due to the large variety of possible combinations of building blocks and topology. Discovering the optimal MOFs for specific applications requires an efficient and accurate search over countless potential candidates. Previous high-throughput screening methods using computational simulations like DFT can be time-consuming. Such methods also require …

Cited by 64 publications (49 citation statements) · References 65 publications
“…There have been many applications of ML in the materials field [11][12][13] and satisfactory prediction performance has been achieved on the properties of MOFs. 7,14 However, advanced machine learning models (e.g., deep neural networks) require a large amount of data for training to make convincing predictions, which is difficult to meet for certain types of materials, such as porphyrin-based MOFs.…”
Section: Introduction (mentioning)
Confidence: 99%
“…Also, GEM proposes to pretrain GNNs via predicting 3D positional information, including interatomic distances, bond lengths, and bond angles. On the other hand, contrastive learning, which aims at learning representations via contrasting positive instances against negative instances, has been widely implemented in GNNs for the chemical sciences. MolCLR applies random masking of atom and edge attributes to generate contrastive instances of molecular graphs. Recent works have also investigated multilevel subgraphs in contrastive training. GraphMVP and 3D Infomax incorporate 3D information into 2D graphs via contrasting molecules represented as 2D topological graphs and 3D geometric structures.…”
Section: Introduction (mentioning)
Confidence: 99%
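The masking-based augmentation the statement attributes to MolCLR can be sketched in a few lines: two independently masked copies of the same molecule form a positive pair for contrastive training. The sketch below is a minimal, self-contained illustration of that general idea; the graph representation, the `MASK` token, and all function names are assumptions for illustration, not the actual MolCLR implementation.

```python
import random

# A molecule is represented here as just a list of atom labels; real
# implementations mask rows of an atom-feature matrix in a graph object.
MASK = "<mask>"

def mask_atoms(atoms, ratio=0.25, rng=random):
    """Return a copy of the atom list with ~ratio of entries replaced by MASK."""
    atoms = list(atoms)
    n_mask = max(1, int(len(atoms) * ratio))  # always mask at least one atom
    for i in rng.sample(range(len(atoms)), n_mask):
        atoms[i] = MASK
    return atoms

def make_contrastive_pair(atoms, ratio=0.25, seed=0):
    """Two independently masked views of one molecule form a positive pair."""
    rng1, rng2 = random.Random(seed), random.Random(seed + 1)
    return mask_atoms(atoms, ratio, rng1), mask_atoms(atoms, ratio, rng2)

ethanol = ["C", "C", "O"]  # heavy atoms of ethanol, schematically
view_a, view_b = make_contrastive_pair(ethanol)
```

In a full pipeline, both views would be encoded by the same GNN and a contrastive loss (e.g. NT-Xent) would pull their embeddings together while pushing apart embeddings of other molecules in the batch.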
“…On the other hand, contrastive learning, which aims at learning representations via contrasting positive instances against negative instances, has been widely implemented in GNNs for the chemical sciences. [54][55][56][57] MolCLR 58 applies random masking of atom and edge attributes to generate contrastive instances of molecular graphs. Recent works have also investigated multi-level subgraphs in contrastive training.…”
Section: Introduction (mentioning)
Confidence: 99%