2016
DOI: 10.1038/srep34256

A Statistical Learning Framework for Materials Science: Application to Elastic Moduli of k-nary Inorganic Polycrystalline Compounds

Abstract: Materials scientists increasingly employ machine or statistical learning (SL) techniques to accelerate materials discovery and design. Such pursuits benefit from pooling training data across, and thus being able to generalize predictions over, k-nary compounds of diverse chemistries and structures. This work presents a SL framework that addresses challenges in materials science applications, where datasets are diverse but of modest size, and extreme values are often of interest. Our advances include the applic…

Cited by 230 publications (183 citation statements)
References 71 publications
“…S5). Recently, de Jong et al. [33] developed a statistical learning (SL) framework using multivariate local regression on crystal descriptors to predict elastic properties using the same data from the Materials Project. Using the same number of training data points, our model achieves root mean squared error (RMSE) on test sets of 0.105 log(GPa) and 0.127 log(GPa) for the bulk and shear moduli, which is similar to the RMSE of SL on the entire data set of 0.0750 log(GPa) and 0.1378 log(GPa).…”
mentioning
confidence: 99%
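The RMSE values quoted in this statement are reported in log10(GPa), i.e. the error of log-transformed moduli. A minimal sketch of how such a metric could be computed from predicted and DFT reference moduli is given below; the function name and the example arrays are illustrative placeholders, not data from either paper.

import numpy as np

def rmse_log_gpa(pred_gpa, dft_gpa):
    """RMSE in log10(GPa) space, the error metric quoted above.

    Both inputs are arrays of moduli in GPa. An RMSE of ~0.1 log(GPa)
    corresponds to a typical multiplicative error of roughly
    10**0.1 - 1, i.e. about 26%.
    """
    pred = np.log10(np.asarray(pred_gpa, dtype=float))
    ref = np.log10(np.asarray(dft_gpa, dtype=float))
    return float(np.sqrt(np.mean((pred - ref) ** 2)))

# Example with made-up bulk moduli (GPa) for three test compounds.
print(rmse_log_gpa([95.0, 210.0, 48.0], [100.0, 180.0, 50.0]))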
“…A significant body of work has demonstrated that ML models can be trained to predict the results of first-principles DFT calculations without the need for further computationally intensive calculations beyond those used to generate the reference data set. To name just a few examples, Rupp et al. used kernel ridge regression (KRR) to train a model for the prediction of DFT molecular atomization energies [39], de Jong et al. used a gradient boosting approach to predict DFT elastic moduli [40], Faber et al. used a KRR model to predict DFT formation energies of Elpasolite (ABC 2 D 6 ) compositions [41], and Ye et al. trained a deep ANN to predict DFT crystal stability [42]. Isayev et al. proposed a model trained to DFT references with the gradient boosting decision tree technique to predict various properties of crystal structures including band gaps, elastic properties, and heat capacities [43].…”
Section: Progress in Machine Learning Methods for Materials Simulations
mentioning
confidence: 99%
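The gradient boosting approach named in this statement can be sketched generically with scikit-learn. Note this is only a minimal sketch, not de Jong et al.'s actual pipeline (which combines gradient boosting with local regression), and the random matrix merely stands in for a real table of composition/structure descriptors mapped to log-scaled DFT moduli.

import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

# Placeholder descriptor table: each row represents one compound described
# by composition/structure statistics; the target is log10 of a DFT-computed
# modulus in GPa. Synthetic data only, for illustration.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 10))
y = X @ rng.normal(size=10) + 0.1 * rng.normal(size=500)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = GradientBoostingRegressor(
    n_estimators=300, learning_rate=0.05, max_depth=3, random_state=0
)
model.fit(X_train, y_train)

rmse = np.sqrt(mean_squared_error(y_test, model.predict(X_test)))
print(f"test RMSE (synthetic data): {rmse:.3f}")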
“…Previous studies by de Jong et al. [79] and Tehrani et al. [71] applied statistical learning techniques to smaller, yet similarly diverse, elastic datasets, using sets of compositional and structural descriptors. In similar fashion, we generate basic statistics from the composition and structure of each material, which we use as the optimization algorithm's z vector.…”
Section: Application to the Materials Science Domain: Photocatalysis
mentioning
confidence: 99%
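A hedged sketch of what "basic statistics from the composition" might look like as descriptor features follows. The element property values are standard tabulated constants, but the function itself is hypothetical and is not the descriptor set actually used by de Jong et al. or by the citing work; a real pipeline would pull elemental properties from a library such as pymatgen rather than hard-coding them.

# Pauling electronegativities and atomic masses (standard tabulated values).
ELECTRONEGATIVITY = {"Fe": 1.83, "O": 3.44, "Si": 1.90}
ATOMIC_MASS = {"Fe": 55.845, "O": 15.999, "Si": 28.085}

def composition_stats(fractions):
    """Basic compositional statistics of the kind used as a descriptor
    ('z') vector: fraction-weighted mean and spread of elemental properties.

    `fractions` maps element symbols to atomic fractions summing to 1.
    """
    def weighted_mean(table):
        return sum(frac * table[el] for el, frac in fractions.items())

    def value_range(table):
        vals = [table[el] for el in fractions]
        return max(vals) - min(vals)

    return {
        "mean_electronegativity": weighted_mean(ELECTRONEGATIVITY),
        "range_electronegativity": value_range(ELECTRONEGATIVITY),
        "mean_atomic_mass": weighted_mean(ATOMIC_MASS),
    }

# Fe2O3: atomic fractions 2/5 Fe and 3/5 O.
print(composition_stats({"Fe": 0.4, "O": 0.6}))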