Machine-learning the configurational energy of multicomponent crystalline solids

2018 | DOI: 10.1038/s41524-018-0110-y

Abstract: Machine learning tools such as neural networks and Gaussian process regression are increasingly being implemented in the development of atomistic potentials. Here, we develop a formalism to leverage such non-linear interpolation tools in describing properties dependent on occupation degrees of freedom in multicomponent solids. Symmetry-adapted cluster functions are used to differentiate distinct local orderings. These local features are used as input to neural networks that reproduce local properties such as t…
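The approach summarized in the abstract — symmetry-adapted cluster functions as local features feeding a neural network that predicts site contributions to the configurational energy — can be sketched in miniature. This is a hypothetical illustration, not the authors' code: it assumes a 1D periodic binary chain, three hand-picked cluster functions per site, and fixed (untrained) network weights.

```python
# Hedged sketch of a site-based model in the spirit of the paper: each site's
# local environment is encoded by cluster functions built from spin-like
# occupation variables, a small neural network maps those features to a site
# energy, and the configurational energy is the sum over sites.
import math

def occupation_to_spin(config):
    # Map occupation variables {0, 1} to spins {-1, +1}, as in cluster expansions.
    return [2 * s - 1 for s in config]

def local_cluster_features(spins, i):
    # Illustrative symmetry-adapted cluster functions centred on site i of a
    # periodic chain: point term, symmetrised nearest-neighbour pair term,
    # and a triplet product.
    n = len(spins)
    left, right = spins[(i - 1) % n], spins[(i + 1) % n]
    point = spins[i]
    pair = 0.5 * (spins[i] * left + spins[i] * right)  # averaged over mirror symmetry
    triplet = left * spins[i] * right
    return [point, pair, triplet]

def site_energy(features, W1, b1, w2, b2):
    # One hidden tanh layer: a nonlinear generalisation of the linear cluster
    # expansion (which would be just a weighted sum of the features).
    hidden = [math.tanh(sum(w * f for w, f in zip(row, features)) + b)
              for row, b in zip(W1, b1)]
    return sum(w * h for w, h in zip(w2, hidden)) + b2

def total_energy(config, W1, b1, w2, b2):
    spins = occupation_to_spin(config)
    return sum(site_energy(local_cluster_features(spins, i), W1, b1, w2, b2)
               for i in range(len(spins)))

# Illustrative fixed weights; in practice these would be fit to DFT energies.
W1 = [[0.5, -0.3, 0.2], [-0.4, 0.6, 0.1]]
b1 = [0.0, 0.1]
w2 = [1.0, -0.8]
b2 = 0.05
print(total_energy([1, 0, 1, 1, 0, 0], W1, b1, w2, b2))
```

Because every site uses the same network and the same symmetry-adapted features, the total energy is invariant under translations of the configuration, mirroring the symmetry built into the real formalism.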

Cited by 69 publications (51 citation statements)
References 67 publications
“…The prediction of crystal structures and their stability [399,400] has also been performed for several materials, such as perovskites [287,401-403], superhard materials [404], bcc materials and Fe alloys [405], binary alloys [406], phosphor hosts [407], Heuslers [408,409], catalysts [410], amorphous carbon [411], high-pressure hydrogen-compressor materials [412], binary intermetallic compounds with transition metals [413], and multicomponent crystalline solids [414]. An atomic-position-independent descriptor was able to reach an MAE of 70 meV/atom for formation-energy predictions on a diverse dataset of more than 85 000 materials [415].…”
Section: Discovery Energies and Stability
Classification: mentioning (confidence: 99%)
“…Gaussian approximation potentials (GAPs) have been used extensively to study different systems, such as elemental boron [422], amorphous carbon [423,424], silicon [425], thermal properties of amorphous GeTe and carbon [426], thermomechanics and defects of iron [427], prediction of inorganic crystal structures by combining ML with random search [428], the λ-SOAP method for tensorial properties of atomistic systems [247], and a unified framework to predict the properties of materials and molecules such as silicon, organic molecules, and protein ligands [429]. A recent review of applications of high-dimensional neural network potentials [430] summarized the notable number of molecular and materials systems studied, which ranges from simple semiconductors such as silicon [233,431,432] and ZnO [433] to more complex systems such as water and metallic clusters [434], molecules [435-437], surfaces [438,439], and liquid/solid interfaces [414,440]. Force fields for nanoclusters have been developed with 2-, 3-, and many-body descriptors [441], and hydrogen adsorption on nanoclusters has been described with structural descriptors such as SOAP [442].…”
Section: Discovery Energies and Stability
Classification: mentioning (confidence: 99%)
“…This communication adds to our nascent but growing body of work in machine learning and artificial intelligence targeting higher-fidelity models of materials physics [12,13]. We have explored machine learning as an approach to bridging scales, by focusing on the representation of complexity emerging from fine-scale physics.…”
Section: Discussion
Classification: mentioning (confidence: 99%)
“…[8]. The current work extends the linear models in [8] by allowing the functional form of the cluster energy to be learned by the machine-learning model. The site-based model using cluster basis functions extends to vibrational energy the site-based neural-network approach introduced by Natarajan and Van der Ven for modeling configurational energy [41]. One benefit of the site-based model is that it allows interaction terms between basis functions from different clusters, which may explain the lower error achieved by the site-based model.…”
Section: Discussion
Classification: mentioning (confidence: 99%)
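The point made in the last statement — that a nonlinear site model admits interaction terms between basis functions from different clusters, which a linear cluster expansion cannot — can be checked with a toy calculation. This is a hypothetical two-feature model, not code from either cited work: for a model additive in its basis functions, the mixed second difference E(1,1) - E(1,0) - E(0,1) + E(0,0) vanishes; for a model with a tanh hidden unit it does not.

```python
# Hedged illustration: a linear cluster expansion is additive in its basis
# functions phi1, phi2, so its mixed second difference is zero. A nonlinear
# site model mixes the basis functions inside the activation, producing an
# effective cross-cluster interaction term.
import math

def linear_site_energy(phi1, phi2, v=(0.7, -0.4)):
    # Linear cluster expansion: E = sum_a v_a * phi_a
    return v[0] * phi1 + v[1] * phi2

def nonlinear_site_energy(phi1, phi2, w=(0.9, 0.5), v=1.2):
    # One tanh hidden unit that couples both basis functions.
    return v * math.tanh(w[0] * phi1 + w[1] * phi2)

def interaction_strength(energy):
    # Mixed second difference: zero iff the model is additive in (phi1, phi2).
    return energy(1, 1) - energy(1, 0) - energy(0, 1) + energy(0, 0)

print(interaction_strength(linear_site_energy))     # ~0: no cross term
print(interaction_strength(nonlinear_site_energy))  # nonzero: cross term present
```

The nonzero mixed difference is exactly the kind of inter-cluster coupling that the quoted discussion credits for the lower error of the site-based model.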