In data-driven materials design, where target materials often have limited data, transfer learning from large datasets of known source materials becomes an appealing strategy, especially across different crystal structures. In this work, we propose a deep transfer learning approach to predict thermodynamically stable perovskite oxides based on a large computational dataset of spinel oxides. A deep neural network (DNN) source-domain model with "Center-Environment" (CE) features was first trained on the formation energies of 5329 spinel oxide structures and then fine-tuned on a small dataset of 855 perovskite oxide structures, yielding a transfer learning model with good transferability to the target domain of perovskite oxides. Using the transferred model, we further predicted the formation energies of 5329 candidate perovskite structures formed from combinations of 73 elements. Combining the formation-energy criterion with structural criteria, namely the tolerance factor (0.7 < t ≤ 1.1) and the octahedron factor (0.45 < μ < 0.7), we predicted 1314 thermodynamically stable perovskite oxides; of these, 144 have been synthesized experimentally, 10 were predicted computationally in other studies, 301 are recorded in the Materials Project database, and 859 are reported here for the first time. Combined with structure-informed features, the transfer learning approach in this work takes advantage of existing data to predict new structures at lower cost, providing an effective acceleration strategy for expensive high-throughput computational screening in materials design. The predicted stable perovskite oxides offer a rich platform for exploring potential renewable energy and electronic materials applications.
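The structural screening step above can be sketched in a few lines. The sketch below uses the standard Goldschmidt tolerance factor, t = (r_A + r_O) / (√2 (r_B + r_O)), and the octahedron factor, μ = r_B / r_O, together with the windows quoted in the abstract; the ionic radii in the usage example are illustrative placeholders, not values taken from the paper.

```python
# Minimal sketch of the geometric screening for ABO3 perovskite candidates,
# assuming the standard Goldschmidt tolerance factor and octahedron factor
# definitions; the paper itself does not spell out the formulas here.
import math

R_O = 1.40  # Shannon ionic radius of O2- in angstrom (common reference value)

def tolerance_factor(r_a: float, r_b: float, r_o: float = R_O) -> float:
    """Goldschmidt tolerance factor t = (r_A + r_O) / (sqrt(2) * (r_B + r_O))."""
    return (r_a + r_o) / (math.sqrt(2) * (r_b + r_o))

def octahedron_factor(r_b: float, r_o: float = R_O) -> float:
    """Octahedron factor mu = r_B / r_O."""
    return r_b / r_o

def passes_structure_screen(r_a: float, r_b: float) -> bool:
    """Apply the abstract's windows: 0.7 < t <= 1.1 and 0.45 < mu < 0.7."""
    t = tolerance_factor(r_a, r_b)
    mu = octahedron_factor(r_b)
    return 0.7 < t <= 1.1 and 0.45 < mu < 0.7

# Illustrative radii (angstrom) for a large A-site and mid-size B-site cation.
print(passes_structure_screen(1.44, 0.645))  # both factors fall inside the windows
```

In the full workflow these geometric filters would be applied on top of the formation-energy criterion predicted by the transferred DNN model, so a candidate must pass both tests to be counted as thermodynamically stable.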
We introduce an end-to-end computational framework that allows for hyperparameter optimization using the DeepHyper library, accelerated model training, and interpretable AI inference. The framework is based on state-of-the-art AI models, including CGCNN, PhysNet, SchNet, MPNN, MPNN-transformer, and TorchMD-NET. We employ these AI models along with the benchmark QM9, hMOF, and MD17 datasets to showcase how the models can predict user-specified material properties within modern computing environments. We demonstrate transferable applications in the modeling of small molecules, inorganic crystals, and nanoporous metal-organic frameworks with a unified, standalone framework. We have deployed and tested this framework on the ThetaGPU supercomputer at the Argonne Leadership Computing Facility and on the Delta supercomputer at the National Center for Supercomputing Applications, providing researchers with modern tools to conduct accelerated AI-driven discovery in leadership-class computing environments. We release these digital assets as open-source scientific software on GitLab and as ready-to-use Jupyter notebooks on Google Colab.