2019
DOI: 10.1103/physrevmaterials.3.063802

Stochastic replica voting machine prediction of stable cubic and double perovskite materials and binary alloys

Abstract: A machine learning approach that we term the "Stochastic Replica Voting Machine" (SRVM) algorithm is presented and applied to binary and three-class classification problems in materials science. Here, we employ SRVM to predict candidate compounds capable of forming stable perovskites and double perovskites and to further classify binary (AB) solids. The results of our binary and ternary classifications compared well with those obtained by SVM and neural network algorithms.
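
The abstract describes the classification task only at a high level. As a minimal sketch of the kind of voting-ensemble classification it summarizes, the toy Python example below trains several classifiers on random nonlinear feature maps ("stochastic functions") and combines them by unweighted majority vote, with an SVM baseline for comparison; the synthetic data, the sinusoidal feature map, and all hyperparameters are illustrative assumptions, not the authors' actual SRVM setup.

# Minimal sketch (not the authors' code): majority voting over an ensemble of
# classifiers built from random nonlinear feature maps, compared with an SVM.
# Data, feature map, and hyperparameters are illustrative assumptions.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Stand-in for compositional/structural descriptors of candidate compounds,
# with a binary "forms a stable perovskite" label.
X, y = make_classification(n_samples=600, n_features=10, n_informative=6,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

def random_feature_map(X, W, b):
    # Random sinusoidal features: one randomly drawn "stochastic function" per replica.
    return np.cos(X @ W + b)

n_replicas, n_random_feats = 25, 50
votes = np.zeros((n_replicas, len(y_te)), dtype=int)
for r in range(n_replicas):
    W = rng.normal(scale=1.0, size=(X.shape[1], n_random_feats))
    b = rng.uniform(0, 2 * np.pi, size=n_random_feats)
    clf = LogisticRegression(max_iter=1000)
    clf.fit(random_feature_map(X_tr, W, b), y_tr)
    votes[r] = clf.predict(random_feature_map(X_te, W, b))

# Unweighted majority vote across replicas.
ensemble_pred = (votes.mean(axis=0) > 0.5).astype(int)
print("voting ensemble accuracy:", (ensemble_pred == y_te).mean())

# SVM baseline for comparison, as in the paper's benchmarking.
svm = SVC(kernel="rbf").fit(X_tr, y_tr)
print("SVM accuracy:", svm.score(X_te, y_te))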

Cited by 12 publications (6 citation statements)
References 67 publications
“…We remark that the use of multiple classifiers (different from our stochastic functions) to enhance accuracy further appears in other machine learning approaches such as those of unweighted "bagging" [64] or more sophisticated "boosting" [65] methods that have been prevalent in, e.g., neural networks; it is conceivable that our accuracy might be further improved by incorporating aspects of these schemes when combining the bare SRVM algorithm that we described in the current work with other known classifiers. Indeed, as we detail elsewhere [67], a function describing any particular neural network can be regarded as yet another member of the ensemble of functions used in an SRVM implementation. We stress that unlike Markov Chain Monte Carlo (MCMC) [66] methods, the crux of our general approach hinges, in the absence of given special details, on the use of random stochastic functions of different types (not that of sampling from a single distribution function).…”
Section: Discussion (mentioning)
confidence: 99%
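
The quoted passage refers to unweighted "bagging" and "boosting" as related ensemble schemes. The short Python illustration below shows these two standard techniques via scikit-learn, not the SRVM combination itself; the dataset and all settings are arbitrary assumptions.

# Illustration of the standard ensemble schemes referenced above (bagging and
# boosting), not of the SRVM combination; dataset and settings are arbitrary.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=8, random_state=1)

# Unweighted "bagging": vote over base learners trained on bootstrap samples.
bagging = BaggingClassifier(DecisionTreeClassifier(), n_estimators=50, random_state=1)

# "Boosting": base learners added sequentially, each reweighting the training set.
boosting = AdaBoostClassifier(n_estimators=50, random_state=1)

for name, model in [("bagging", bagging), ("boosting", boosting)]:
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean CV accuracy = {scores.mean():.3f}")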
“…Given the simplicity of our algorithm and its numerous natural extensions, much more work can be done to further streamline the algorithm and apply it to many different data sets. Aside from the numerous data set benchmarks tested in the current work, two additional materials-oriented classification problems (both binary and ternary) were studied in [67]. The current results of our supervised machine learning study augment those of an earlier replica-type approach for unsupervised learning and the solution of combinatorial problems in which the notions of stability and (potentially recursive) voting or information theory correlations/inference were employed [13][14][15][16][17][18][19][20][22].…”
Section: Discussion (mentioning)
confidence: 99%
“…In the context of density functional theory (DFT), a proof-of-principle demonstration based on a system of free fermions showed that density functionals can be accurately approximated using kernel ridge regression [32]. These works motivated a wide array of machine learning applications of data-enabled chemistry and density functional theory [33,34] with the aim of predicting, accelerating [23,35,36], and improving the prediction of atomic-scale properties of materials and chemical systems (including uncertainty estimation [37]), reaching quantum chemical accuracy using DFT [38]. Proposals to bypass the solution of the Kohn-Sham equations in DFT have demonstrated acceleration of simulations of materials and molecules [39][40][41].…”
Section: Machine Learning In Simulations Of Strongly Correlated Fermions (mentioning)
confidence: 99%
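
The kernel ridge regression mentioned in this citation is a standard regression technique. As a generic sketch only, the snippet below fits a Gaussian-kernel ridge model to synthetic one-dimensional "densities" and scalar "energies" standing in for the free-fermion data of the cited Ref. [32]; the toy data, kernel width, and regularization strength are assumptions.

# Generic kernel ridge regression sketch, standing in for the density-to-energy
# fits referenced above; the synthetic data and hyperparameters are assumptions.
import numpy as np
from sklearn.kernel_ridge import KernelRidge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)

# Toy "densities" sampled on a grid and a smooth scalar "energy" functional of them.
n_samples, n_grid = 400, 32
densities = rng.random((n_samples, n_grid))
energies = np.sum(densities**2, axis=1) + 0.1 * np.sin(densities.sum(axis=1))

X_tr, X_te, y_tr, y_te = train_test_split(densities, energies,
                                          test_size=0.25, random_state=2)

# Gaussian (RBF) kernel ridge regression; alpha and gamma are chosen arbitrarily
# here and would normally be set by cross-validation.
krr = KernelRidge(kernel="rbf", alpha=1e-3, gamma=0.1)
krr.fit(X_tr, y_tr)
print("test R^2:", krr.score(X_te, y_te))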
“…2,3,4 However, this method requires performing abundant DFT calculations to search for a few desired materials among thousands or even more candidates, at low efficiency. On the other hand, machine learning methodologies have been widely applied to materials and chemical sciences, [5][6][7] such as for predicting crystal stability 8 and chemical reactivity, 9 designing lithium batteries 10 and catalysts, 11 and discovering two-dimensional (2D) optoelectronic materials 12 and 2D ferromagnetic materials, 13 etc. These advances demonstrate that supervised machine learning techniques could accelerate the discovery of new functional materials compared with high-throughput first-principles calculations, but a supervised machine learning model needs a large amount of materials data to make accurate predictions, leading to a large consumption of computational resources.…”
Section: Introduction (mentioning)
confidence: 99%