2023
DOI: 10.1016/j.patter.2023.100803
Accurate, interpretable predictions of materials properties within transformer language models

Cited by 15 publications (3 citation statements)
References 123 publications
“…The lack of data for the end point of interest can be addressed by leveraging advanced techniques, including transfer learning and self-supervised learning. To assess the potential usefulness of applying CG2-NNs in conjunction with one such algorithm, we consider the task of band gap prediction across different classes of reticular materials: the models pretrained on the MOF band gap are used to evaluate the same property in COFs (Figure a, Table S3). The models trained from scratch, i.e., without pretraining, serve as a baseline.…”
Section: Results
confidence: 99%
“…They are trained on large amounts of data spanning multiple modalities of concepts. They have been evolving interactively with large language models (LLMs) [19,20], which have also had a significant impact on materials science [21-29]. FMs and LLMs are often referred to interchangeably because natural language is so ubiquitous, which should also be the case for materials science; however, FMs more specifically connect different modalities and are expected to open up new ways of conceptual representation.…”
Section: Introduction
confidence: 99%
“…There have been several previous research works for forward prediction of material properties, such as descriptor-based methods, including models in MatMiner 14 and classical force-field-inspired descriptors (CFID); 15 graph-based methods, such as crystal graph convolutional neural networks (CGCNNs), 16 Materials Graph Network (MEGNet), 17 Atomistic Line Graph Neural Network (ALIGNN), 18 CoGN, 19 M3GNet, 20 Matformer, 21 and ComFormer; 22 and language-based methods, such as LLM-Prop, 23 etc. Similarly, some of the earlier inverse-design methods are Crystal Diffusional Variational Autoencoder (CDVAE), 24 Fourier Transformed Crystal Properties (FTCP), 25 G-SchNet, 26 MatBERT, 27 CrystaLLM, 28 Crystal-LLM, 29 and xyztransformer. 30 AtomGPT is a deep learning model 6 that leverages a few transformer architectures, such as GPT2 31 and quantized Mistral-AI, 32 to learn the complex relationships between atomic structures and material properties from datasets such as JARVIS-DFT.…”
confidence: 99%