The accurate description of chemical processes often requires computationally demanding methods such as density-functional theory (DFT), making long simulations of large systems infeasible. In this Letter we introduce a new kind of neural-network representation of DFT potential-energy surfaces, which provides the energy and forces as a function of all atomic positions in systems of arbitrary size and is several orders of magnitude faster than DFT. The high accuracy of the method is demonstrated for bulk silicon by comparison with empirical potentials and DFT. The method is general and can be applied to all types of periodic and nonperiodic systems.
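To make the construction concrete: in this high-dimensional scheme the total energy is written as a sum of atomic contributions, each supplied by a small atomic neural network acting on a symmetry-function description G_i of atom i's local chemical environment, and forces follow by analytic differentiation. The expression below is a generic sketch of that decomposition, not an equation reproduced from the Letter:

    E_{\mathrm{tot}} = \sum_{i=1}^{N_{\mathrm{atoms}}} E_i(\mathbf{G}_i),
    \qquad
    \mathbf{F}_\alpha = -\nabla_{\mathbf{R}_\alpha} E_{\mathrm{tot}}
      = -\sum_{i}\sum_{\mu} \frac{\partial E_i}{\partial G_{i,\mu}}\,
        \nabla_{\mathbf{R}_\alpha} G_{i,\mu}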
Neural networks offer an unbiased and numerically very accurate approach to representing high-dimensional ab initio potential-energy surfaces. Once constructed, neural network potentials can provide the energies and forces many orders of magnitude faster than electronic structure calculations, and thus enable molecular dynamics simulations of large systems. However, Cartesian coordinates are not a good choice of input, because the potential energy must be invariant under translation and rotation of the system and under permutation of like atoms; a transformation to symmetry functions with these invariances built in is therefore required. Using simple benchmark systems, the properties of several types of symmetry functions suitable for the construction of high-dimensional neural network potential-energy surfaces are discussed in detail. The symmetry functions are general and can be applied to all types of systems such as molecules, crystalline and amorphous solids, and liquids.
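As an illustration of one such descriptor, the sketch below evaluates a radial, Behler-type symmetry function, a Gaussian of the interatomic distance damped by a smooth cosine cutoff, for a single atom in a toy cluster. The parameter values (eta, r_s, r_c) and the coordinates are arbitrary illustrative choices, not values taken from the paper.

    import numpy as np

    def cutoff(r, r_c):
        # Cosine cutoff f_c(r) = 0.5*(cos(pi*r/r_c) + 1) for r <= r_c, 0 beyond.
        return np.where(r <= r_c, 0.5 * (np.cos(np.pi * r / r_c) + 1.0), 0.0)

    def radial_symmetry_function(positions, i, eta, r_s, r_c):
        # G2_i = sum over j != i of exp(-eta*(R_ij - r_s)**2) * f_c(R_ij)
        r_ij = np.linalg.norm(positions - positions[i], axis=1)
        r_ij = np.delete(r_ij, i)  # exclude the central atom itself
        return np.sum(np.exp(-eta * (r_ij - r_s) ** 2) * cutoff(r_ij, r_c))

    # Toy four-atom cluster with illustrative coordinates (in Angstrom).
    pos = np.array([[0.0, 0.0, 0.0],
                    [1.5, 0.0, 0.0],
                    [0.0, 2.0, 0.0],
                    [1.0, 1.0, 1.0]])
    print(radial_symmetry_function(pos, i=0, eta=0.5, r_s=0.0, r_c=6.0))

Because the cutoff goes smoothly to zero at r_c, the descriptor, and hence the predicted energy, remains a continuous function of the atomic positions as neighbors enter or leave the cutoff sphere.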
Nowadays, computer simulations have become a standard tool in essentially all fields of chemistry, condensed matter physics, and materials science. In order to keep up with state-of-the-art experiments and the ever-growing complexity of the investigated problems, there is a constantly increasing need for simulations of more realistic, i.e., larger, model systems with improved accuracy. In many cases, the availability of sufficiently efficient interatomic potentials providing reliable energies and forces has become a serious bottleneck for performing these simulations. To address this problem, a paradigm change is currently taking place in the development of interatomic potentials. Since the early days of computer simulations, simplified potentials have been derived using physical approximations whenever the direct application of electronic structure methods has been too demanding. Recent advances in machine learning (ML) now offer an alternative approach to representing potential-energy surfaces by fitting large data sets from electronic structure calculations. In this perspective, the central ideas underlying these ML potentials, solved problems, and remaining challenges are reviewed, along with a discussion of their current applicability and limitations.
A lot of progress has been made in recent years in the development of atomistic potentials using machine learning (ML) techniques. In contrast to most conventional potentials, which are based on physical approximations and simplifications to derive an analytic functional relation between the atomic configuration and the potential energy, ML potentials rely on simple but very flexible mathematical terms without a direct physical meaning. Instead, in the case of ML potentials the topology of the potential-energy surface is "learned" by adjusting a number of parameters with the aim of reproducing a set of reference electronic structure data as accurately as possible. Due to this bias-free construction, they are applicable to a wide range of systems without changes in their functional form, and a very high accuracy close to the underlying first-principles data can be obtained. Neural network potentials (NNPs), which were first proposed about two decades ago, are an important class of ML potentials. Although the first NNPs were restricted to small molecules with only a few degrees of freedom, they are now applicable to high-dimensional systems containing thousands of atoms, which enables addressing a variety of problems in chemistry, physics, and materials science. In this tutorial review, the basic ideas of NNPs are presented with a special focus on developing NNPs for high-dimensional condensed systems. A recipe for the construction of these potentials is given, and remaining limitations of the method are discussed.
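A minimal sketch of how such a high-dimensional NNP is evaluated: one small feed-forward network per atom maps that atom's symmetry-function vector to an atomic energy, and the atomic energies are summed. The layer sizes, the tanh activation, and the random weights below are illustrative placeholders; in a real potential the weights are fitted to reference electronic structure data.

    import numpy as np

    rng = np.random.default_rng(0)

    def atomic_nn(g, w1, b1, w2, b2):
        # One hidden layer: E_i = w2 . tanh(w1 @ g + b1) + b2
        return w2 @ np.tanh(w1 @ g + b1) + b2

    n_atoms, n_sym, n_hidden = 5, 8, 10
    G = rng.normal(size=(n_atoms, n_sym))    # one symmetry-function vector per atom
    w1 = rng.normal(size=(n_hidden, n_sym))  # placeholder weights (would be fitted)
    b1 = rng.normal(size=n_hidden)
    w2 = rng.normal(size=n_hidden)
    b2 = 0.0

    # Total energy: sum of atomic contributions from identical atomic networks.
    E_total = sum(atomic_nn(G[i], w1, b1, w2, b2) for i in range(n_atoms))
    print(E_total)

Because every atom of a given element is evaluated by the same network, the total energy is invariant under permutation of like atoms, and the scheme extends naturally to systems of arbitrary size.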
The accuracy of the results obtained in molecular dynamics or Monte Carlo simulations crucially depends on a reliable description of the atomic interactions. A large variety of efficient potentials has been proposed in the literature, but often the optimum functional form is difficult to find and strongly depends on the particular system. In recent years, artificial neural networks (NNs) have become a promising new method for constructing potentials for a wide range of systems. They offer a number of advantages: they are very general and applicable to systems as different as small molecules, semiconductors, and metals; they are numerically very accurate and fast to evaluate; and they can be constructed using any electronic structure method. Significant progress has been made in recent years, and a number of successful applications demonstrate the capabilities of neural network potentials. In this Perspective, the current status of NN potentials is reviewed, and their advantages and limitations are discussed.