We introduce a class of interatomic potential models that can be automatically generated from data consisting of the energies and forces experienced by atoms, as derived from quantum mechanical calculations. The models do not have a fixed functional form and hence are capable of modeling complex potential energy landscapes. They are systematically improvable with more data. We apply the method to bulk crystals, and test it by calculating properties at high temperatures. Using the interatomic potential to generate the long molecular dynamics trajectories required for such calculations saves orders of magnitude in computational cost.

Atomic scale modeling of materials is now routinely and widely applied, and encompasses a range of techniques from exact quantum chemical methods [1] through density functional theory (DFT) [2] and semi-empirical quantum mechanics [3] to analytic interatomic potentials [4]. The associated trade-offs in accuracy and computational cost are well known. Arguably, there is a gap between models that treat electrons explicitly, and those that do not. Models in the former class are in practice limited to handling a few thousand atoms, while the simple analytic interatomic potentials are limited in accuracy, regardless of how they are parametrized. The panels in the top row of Fig. 1 illustrate the typical performance of analytic potentials in bulk semiconductors. Perhaps surprisingly, potentials that are generally regarded as adequate for describing these bulk phases show significant deviation from the quantum mechanical potential energy surface. This in turn gives rise to significant errors in predicting properties such as elastic constants and phonon spectra.

In this Letter we are concerned with the problem of modeling the Born-Oppenheimer potential energy surface (PES) of a set of atoms, but without recourse to simulating the electrons explicitly. We mostly restrict our attention to modeling the bulk phases of carbon, silicon, germanium, iron and gallium nitride, using a unified framework. Even such single-phase potentials could be useful for calculating physical properties, e.g. the thermal expansion coefficient, the phonon contribution to the thermal conductivity, the temperature dependence of the phonon modes, or as part of QM/MM hybrid schemes [7].

The first key insight is that this is actually practicable: the reason that interatomic potentials are at all useful is that the PES is a relatively smooth function of the nuclear coordinates. Improving potential modeling is difficult not because the PES is rough, but because it does not easily decompose into simple closed functional forms. Secondly, away from isolated quantum critical points, the behavior of atoms is localized, in the sense that the total energy of a system can be written as a sum of atomic energies, $E = \sum_i \varepsilon(\{\mathbf{r}_{ij}\})$, where $\mathbf{r}_{ij} = \mathbf{r}_j - \mathbf{r}_i$ is the relative position between atoms $i$ and $j$.
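To make the locality ansatz concrete, here is a minimal sketch (in Python, with a toy pair function standing in for the learned atomic energy) of a total energy assembled as a sum of per-atom contributions, each depending only on the neighbour vectors r_ij within a finite cutoff. The cutoff value and the functional form are illustrative assumptions, not part of the method.

```python
import numpy as np

def local_energy(neighbour_vectors):
    """Toy local energy eps({r_ij}): depends only on the relative positions
    of neighbours inside the cutoff (a Lennard-Jones-like pair sum here)."""
    r = np.linalg.norm(neighbour_vectors, axis=1)
    return 0.5 * np.sum(r**-12 - r**-6)

def total_energy(positions, cutoff=3.0):
    """E = sum_i eps({r_ij}), with r_ij = r_j - r_i restricted to |r_ij| < cutoff
    (isolated cluster, no periodic boundary conditions, for brevity)."""
    energy = 0.0
    for i, r_i in enumerate(positions):
        r_ij = np.delete(positions, i, axis=0) - r_i      # r_ij = r_j - r_i
        r_ij = r_ij[np.linalg.norm(r_ij, axis=1) < cutoff]
        energy += local_energy(r_ij)
    return energy

# Example: energy of a small random cluster
positions = np.random.default_rng(0).normal(scale=1.5, size=(8, 3))
print(total_energy(positions))
```

Because each atomic term sees only its own neighbourhood, the cost scales linearly with the number of atoms, which is what makes fitting and evaluating such models tractable.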
We review some recently published methods to represent atomic neighbourhood environments, and analyse their relative merits in terms of their faithfulness and suitability for fitting potential energy surfaces. The crucial properties that such representations (sometimes called descriptors) must have are differentiability with respect to moving the atoms, and invariance to the basic symmetries of physics: rotation, reflection, translation, and permutation of atoms of the same species. We demonstrate that certain widely used descriptors that initially look quite different are specific cases of a general approach, in which a finite set of basis functions with increasing angular wave numbers are used to expand the atomic neighbourhood density function. Using the example system of small clusters, we quantitatively show that this expansion needs to be carried to higher and higher wave numbers as the number of neighbours increases in order to obtain a faithful representation, and that variants of the descriptors converge at very different rates. We also propose an altogether new approach, called Smooth Overlap of Atomic Positions (SOAP), that sidesteps these difficulties by directly defining the similarity between any two neighbourhood environments, and show that it is still closely connected to the invariant descriptors. We test the performance of the various representations by fitting models to the potential energy surface of small silicon clusters and the bulk crystal.
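To illustrate the kind of expansion-based invariant discussed above, the sketch below (a toy under stated assumptions, not the published SOAP implementation) expands the angular distribution of a neighbour environment in spherical harmonics up to a chosen l_max and collapses the coefficients into the rotationally invariant power spectrum p_l = sum_m |c_lm|^2; radial channels and Gaussian smearing are omitted for brevity.

```python
import numpy as np
from scipy.special import sph_harm
from scipy.spatial.transform import Rotation

def power_spectrum(neighbour_vectors, l_max=4):
    """Invariant fingerprint p_l = sum_m |c_lm|^2, with c_lm = sum_j Y_lm(r_hat_j).
    A rotation of the environment mixes the c_lm unitarily, so p_l is unchanged."""
    x, y, z = np.asarray(neighbour_vectors, dtype=float).T
    r = np.sqrt(x**2 + y**2 + z**2)
    theta = np.arctan2(y, x)                      # azimuthal angle
    phi = np.arccos(np.clip(z / r, -1.0, 1.0))    # polar angle
    p = np.empty(l_max + 1)
    for l in range(l_max + 1):
        c_lm = np.array([np.sum(sph_harm(m, l, theta, phi))
                         for m in range(-l, l + 1)])
        p[l] = np.sum(np.abs(c_lm)**2)
    return p

# Two environments related by a rigid rotation give the same fingerprint
env = np.random.default_rng(1).normal(size=(6, 3))
R = Rotation.random(random_state=2).as_matrix()
print(np.allclose(power_spectrum(env), power_spectrum(env @ R.T)))
```

The truncation at l_max is exactly the issue quantified above: richer environments need higher angular wave numbers before the fingerprint becomes faithful.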
Evaluating the (dis)similarity of crystalline, disordered and molecular compounds is a critical step in the development of algorithms that automatically navigate the configuration space of complex materials. For instance, a structural similarity metric is crucial for classifying structures, for searching chemical space for better compounds and materials, and for driving the next generation of machine-learning techniques for predicting the stability and properties of molecules and materials. In the last few years several strategies have been designed to compare atomic coordination environments. In particular, the Smooth Overlap of Atomic Positions (SOAP) has emerged as a natural framework to obtain translation-, rotation- and permutation-invariant descriptors of groups of atoms, driven by the design of various classes of machine-learned interatomic potentials. Here we discuss how one can combine such local descriptors using a Regularized Entropy Match (REMatch) approach to describe the similarity of both whole molecular and bulk periodic structures, introducing powerful metrics that allow the navigation of alchemical and structural complexity within a unified framework. Furthermore, using this kernel and a ridge regression method we can also predict atomization energies for a database of small organic molecules with a mean absolute error below 1 kcal/mol, reaching an important milestone in the application of machine-learning techniques to the evaluation of molecular properties.
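As a rough sketch of the REMatch construction (with an arbitrary entropy regularisation gamma and toy data, not the settings of the paper), the function below takes a matrix C of similarities between the local environments of two structures, solves the entropy-regularised matching problem by Sinkhorn iteration, and returns the resulting global similarity Tr(P^T C).

```python
import numpy as np

def rematch_kernel(C, gamma=0.1, tol=1e-9, max_iter=1000):
    """REMatch-style global similarity from a local-environment kernel matrix C.

    P is the transport plan with uniform marginals (rows sum to 1/n, columns
    to 1/m) that maximises sum_ij P_ij C_ij + gamma * H(P); Sinkhorn scaling
    of the Gibbs kernel exp(C / gamma) yields P, and Tr(P^T C) is returned."""
    n, m = C.shape
    K = np.exp(C / gamma)
    a, b = np.full(n, 1.0 / n), np.full(m, 1.0 / m)   # target marginals
    u, v = np.ones(n), np.ones(m)
    for _ in range(max_iter):
        u_new = a / (K @ v)
        v_new = b / (K.T @ u_new)
        if np.max(np.abs(u_new - u)) < tol and np.max(np.abs(v_new - v)) < tol:
            u, v = u_new, v_new
            break
        u, v = u_new, v_new
    P = u[:, None] * K * v[None, :]                   # optimal transport plan
    return float(np.sum(P * C))

# Toy similarity matrix between the environments of a 3-atom and a 4-atom structure
C = np.random.default_rng(3).uniform(size=(3, 4))
print(rematch_kernel(C))
```

In the small-gamma limit the plan approaches the best one-to-one matching of environments, while a large gamma recovers an average-kernel comparison, which is why a single parameter interpolates between the two behaviours.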
Statistical learning based on a local representation of atomic structures provides a universal model of chemical stability.
We introduce a Gaussian approximation potential (GAP) for atomistic simulations of liquid and amorphous elemental carbon. Based on a machine-learning representation of the density-functional theory (DFT) potential-energy surface, such interatomic potentials enable materials simulations with close-to-DFT accuracy but at much lower computational cost. We first determine the maximum accuracy that any finite-range potential can achieve in carbon structures; then, using a novel hierarchical set of two-, three-, and many-body structural descriptors, we construct a GAP model that can indeed reach the target accuracy. The potential yields accurate energetic and structural properties over a wide range of densities; it also correctly captures the structure of the liquid phases, at variance with state-of-the-art empirical potentials. Exemplary applications of the GAP model to surfaces of "diamond-like" tetrahedral amorphous carbon (ta-C) are presented, including an estimate of the amorphous material's surface energy, and simulations of high-temperature surface reconstructions ("graphitization"). The new interatomic potential appears to be promising for realistic and accurate simulations of nanoscale amorphous carbon structures.
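The hierarchical construction described above can be pictured as an additive energy model; the sketch below uses trivial placeholder functions in place of the fitted Gaussian-process terms, and the descriptors, functional forms and prefactors are invented for illustration only.

```python
import numpy as np

# Placeholders standing in for the fitted Gaussian-process regression terms;
# in a real GAP each would be a kernel expansion over reference configurations.
def e_2b(r):                     # two-body term: a pair distance
    return 0.01 * (r - 1.5)**2

def e_3b(r1, r2, cos_theta):     # three-body term: two distances and an angle
    return 0.005 * (r1 + r2) * (cos_theta + 1.0 / 3.0)**2

def e_mb(descriptor):            # many-body term: a per-atom descriptor vector
    return 0.001 * float(np.sum(np.asarray(descriptor)**2))

def hierarchical_energy(pairs, triplets, atom_descriptors):
    """Total energy as the sum of 2-body, 3-body and many-body contributions."""
    return (sum(e_2b(r) for r in pairs)
            + sum(e_3b(*t) for t in triplets)
            + sum(e_mb(d) for d in atom_descriptors))

# Tiny made-up inputs: two pairs, one triplet, two atomic descriptors
print(hierarchical_energy([1.4, 1.6], [(1.4, 1.6, -0.3)], [np.ones(4), np.zeros(4)]))
```

The appeal of this layered form is that the cheap low-order terms absorb most of the energy variation, leaving only a small residual for the expensive many-body descriptor to fit.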