One endeavour of modern physical chemistry is to use bottom-up approaches to design materials and drugs with desired properties. Here we introduce an atomistic structure learning algorithm (ASLA) that utilizes a convolutional neural network to build 2D compounds and layered structures atom by atom. The algorithm takes no prior data or knowledge of atomic interactions but instead queries a first-principles quantum mechanical program for physical properties. Using reinforcement learning, the algorithm accumulates knowledge of chemical compound space for a given number and type of atoms and stores it in the neural network, ultimately learning the blueprint for the optimal structural arrangement of the atoms for a given target property. ASLA is demonstrated to work on diverse problems, including grain boundaries in graphene sheets, organic compound formation, and a surface oxide structure. This approach to structure prediction is a first step toward direct manipulation of atoms with artificially intelligent first-principles computer codes.
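The following is a minimal sketch of the kind of atom-by-atom build loop described above, not the authors' implementation: a toy pairwise energy stands in for the first-principles (DFT) evaluation, a simple per-cell value map stands in for the convolutional neural network, and the grid size, atom count, exploration rate, and learning rate are illustrative values chosen here.

```python
# Minimal ASLA-style build-loop sketch (not the authors' code).
# Stand-ins: a toy pairwise energy replaces the quantum mechanical call,
# and a tabular per-cell value map replaces the convolutional neural network.
import numpy as np

rng = np.random.default_rng(0)
GRID = 8          # 8 x 8 placement grid (assumed size)
N_ATOMS = 5       # atoms to place per episode (assumed)
EPS = 0.2         # exploration rate
LR = 0.1          # value-update rate

values = np.zeros((GRID, GRID))   # stand-in for the network's per-cell scores

def toy_energy(positions):
    """Hypothetical surrogate for the first-principles evaluator:
    a pairwise potential with a preferred separation of ~1.5 cells."""
    e = 0.0
    for i in range(len(positions)):
        for j in range(i + 1, len(positions)):
            d = np.linalg.norm(np.subtract(positions[i], positions[j]))
            e += (d - 1.5) ** 2 - 1.0
    return e

for episode in range(200):
    occupied = np.zeros((GRID, GRID), dtype=bool)
    positions = []
    for _ in range(N_ATOMS):
        # epsilon-greedy choice among empty cells, scored by the value map
        empty = np.argwhere(~occupied)
        if rng.random() < EPS:
            cell = empty[rng.integers(len(empty))]
        else:
            cell = max(empty, key=lambda c: values[c[0], c[1]])
        occupied[cell[0], cell[1]] = True
        positions.append(cell)
    reward = -toy_energy(positions)           # lower energy -> higher reward
    for (r, c) in positions:                  # pull visited cells toward the reward
        values[r, c] += LR * (reward - values[r, c])

print("learned cell scores:\n", np.round(values, 2))
```

Running the sketch for a few hundred episodes biases the value map toward low-energy arrangements, which is the basic feedback loop the abstract describes; the actual method replaces both stand-ins with a CNN and a quantum mechanical evaluator.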
We show how to speed up global optimization of molecular structures using machine learning methods. To represent the molecular structures we introduce the auto-bag feature vector, which combines: i) a local feature vector for each atom, ii) an unsupervised clustering of such feature vectors for many atoms across several structures, and iii) a count, for a given structure, of how many times each cluster is represented. During subsequent global optimization searches, accumulated structure-energy relations of relaxed structural candidates are used to assign local energies to each atom using supervised learning. Specifically, the local energies follow from assigning an energy to each cluster of local feature vectors and requiring the sum of local energies to reproduce the total structural energies in a least-squares sense. The usefulness of the method is demonstrated in basin hopping searches for 19-atom structures described by single- or double-well Lennard-Jones type potentials and for 24-atom carbon structures described by density functional theory (DFT). In all cases, utilizing the local energy information derived on the fly enhances the rate at which the global minimum energy structure is found.
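A minimal sketch of the auto-bag construction and the least-squares local-energy fit, under assumed ingredients not specified in this abstract: radial-distance histograms as the per-atom feature vector, scikit-learn's KMeans for the unsupervised clustering, and a plain np.linalg.lstsq solve of counts · e_cluster ≈ E_total; the descriptors, cluster count, and any regularization used in the paper may differ.

```python
# Auto-bag-style descriptor and local-energy fit (illustrative sketch).
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
N_CLUSTERS = 6  # assumed number of clusters

def atomic_features(structure, n_bins=8, r_max=4.0):
    """One histogram of interatomic distances per atom (toy local descriptor)."""
    d = np.linalg.norm(structure[:, None, :] - structure[None, :, :], axis=-1)
    feats = []
    for i in range(len(structure)):
        h, _ = np.histogram(d[i][d[i] > 0], bins=n_bins, range=(0, r_max))
        feats.append(h)
    return np.asarray(feats, dtype=float)

def lj_energy(structure):
    """Toy Lennard-Jones-type total energy used to generate training data."""
    d = np.linalg.norm(structure[:, None, :] - structure[None, :, :], axis=-1)
    r = np.clip(d[np.triu_indices(len(structure), 1)], 0.8, None)
    return np.sum(4.0 * (r**-12 - r**-6))

# Toy data set: random 19-atom structures and their total energies.
structures = [rng.uniform(0.0, 3.0, size=(19, 3)) for _ in range(60)]
energies = np.array([lj_energy(s) for s in structures])

# i) + ii) local features for all atoms, clustered without supervision
all_feats = np.vstack([atomic_features(s) for s in structures])
kmeans = KMeans(n_clusters=N_CLUSTERS, n_init=10, random_state=0).fit(all_feats)

# iii) per-structure count of how many atoms fall in each cluster
counts = np.array([
    np.bincount(kmeans.predict(atomic_features(s)), minlength=N_CLUSTERS)
    for s in structures
])

# Local energies: solve counts @ e_cluster ~= total energies in least squares.
e_cluster, *_ = np.linalg.lstsq(counts.astype(float), energies, rcond=None)
print("fitted per-cluster local energies:", np.round(e_cluster, 3))
```

The fitted per-cluster energies can then be summed per structure (counts @ e_cluster) to give cheap energy estimates that help decide which basin hopping candidates are worth relaxing with the expensive evaluator.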
We demonstrate how image recognition and reinforcement learning may be combined to determine the atomistic structure of reconstructed crystalline surfaces. A deep neural network represents a reinforcement learning agent that obtains training rewards by interacting with an environment. The environment contains a quantum mechanical potential energy evaluator in the form of a density functional theory program. The agent handles the 3D atomistic structure as a series of stacked 2D images and outputs the next atom type to place and the atomic site to occupy. Agents are seen to require 1000-10 000 single-point density functional theory evaluations to learn by themselves how to build the optimal surface reconstructions of anatase TiO2(001)-(1 × 4) and rutile SnO2(110)-(4 × 1).
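A minimal sketch of the state and action bookkeeping this abstract describes, with assumed grid dimensions and atom types; a fixed random linear scorer stands in for the deep neural network, and no density functional theory calls are made.

```python
# State/action sketch: stacked 2D occupancy images, action = (type, site).
import numpy as np

rng = np.random.default_rng(2)
NX, NY, NZ = 6, 6, 3            # lateral grid and number of stacked layers (assumed)
TYPES = ["Ti", "O"]             # atom types the agent may place (example)

# State: one 2D occupancy image per (layer, type) channel, stacked.
state = np.zeros((NZ, len(TYPES), NX, NY), dtype=np.int8)

# Stand-in for the network: a fixed random linear map from the flattened
# state to a Q-value for every (type, layer, x, y) action.
W = rng.normal(size=(len(TYPES) * NZ * NX * NY, state.size)) * 0.01

def q_values(state):
    q = W @ state.ravel().astype(float)
    return q.reshape(len(TYPES), NZ, NX, NY)

def place_next_atom(state):
    """Greedy action: highest-Q (type, layer, x, y) among unoccupied sites."""
    q = q_values(state)
    occupied = state.sum(axis=1) > 0      # (NZ, NX, NY) sites holding any atom
    q[:, occupied] = -np.inf              # mask filled sites for all types
    t, z, x, y = np.unravel_index(np.argmax(q), q.shape)
    state[z, t, x, y] = 1
    return TYPES[t], (z, x, y)

for step in range(5):
    atom, site = place_next_atom(state)
    print(f"step {step}: place {atom} at layer/x/y = {site}")
```

In the full method the scorer would be a trained deep network, and each completed structure would be handed to the density functional theory program to obtain the reward that drives the training.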