Machine-learning-based interatomic potential energy surface (PES) models are revolutionizing the field of molecular modeling. However, although much faster than electronic structure schemes, these models still rely on costly deep-neural-network evaluations to predict the energy and atomic forces, and therefore run less efficiently than typical empirical force fields. Herein, we report a model compression scheme for boosting the performance of the Deep Potential (DP) model, a deep-learning-based PES model. This scheme, which we call DP Compress, is an efficient postprocessing step applied after the training of DP models (DP Train). DP Compress combines several DP-specific compression techniques that typically speed up DP-based molecular dynamics simulations by an order of magnitude and reduce memory consumption by an order of magnitude. We demonstrate that DP Compress is sufficiently accurate by testing a variety of physical properties of Cu, H2O, and Al–Cu–Mg systems. DP Compress applies to both CPU and GPU machines and is publicly available online.
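The compression techniques are only named at a high level here and in the related abstract below (model tabulation, kernel fusion, redundancy removal). As a rough illustration of the tabulation idea, the following is a minimal NumPy sketch, not the DeePMD-kit implementation: a small scalar-to-vector embedding network of the kind used by DP is sampled once on a fine grid after training and then evaluated by cheap piecewise-linear interpolation instead of full matrix multiplications. All layer sizes, grid bounds, and names are assumptions made for this sketch.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy "trained" embedding net: 1 -> 16 -> 32 with tanh activations (hypothetical sizes).
    W1, b1 = rng.normal(size=(1, 16)), rng.normal(size=16)
    W2, b2 = rng.normal(size=(16, 32)), rng.normal(size=32)

    def embedding_net(s):
        # Full network evaluation for scalar inputs s of shape [N].
        h = np.tanh(s[:, None] @ W1 + b1)
        return np.tanh(h @ W2 + b2)            # shape [N, 32]

    # Postprocessing step: sample the trained net once on a fine uniform grid.
    lo, hi, n_grid = 0.0, 1.0, 4096            # assumed input range and table size
    grid = np.linspace(lo, hi, n_grid)
    table = embedding_net(grid)                # shape [n_grid, 32]

    def embedding_tabulated(s):
        # Cheap piecewise-linear lookup that replaces the forward pass.
        pos = (np.clip(s, lo, hi) - lo) / (hi - lo) * (n_grid - 1)
        i = np.minimum(pos.astype(int), n_grid - 2)   # left grid index
        t = (pos - i)[:, None]                        # fractional offset in the cell
        return (1.0 - t) * table[i] + t * table[i + 1]

    # Accuracy check on random inputs inside the tabulated range.
    s = rng.uniform(lo, hi, size=100_000)
    err = np.max(np.abs(embedding_net(s) - embedding_tabulated(s)))
    print(f"max abs tabulation error: {err:.1e}")

In such a scheme the grid spacing sets the accuracy/memory trade-off; the method reported in the paper combines this kind of tabulation with kernel fusion and redundancy removal, as noted in the abstract below.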
High-performance computing, together with a neural network model trained on data generated with first-principles methods, has greatly extended the spatial and temporal scales accessible to ab initio molecular dynamics on modern supercomputers. The previous state of the art achieved 1-2 nanoseconds of molecular dynamics simulation per day for a 100-million-atom system on the entire Summit supercomputer. In this paper, we have significantly reduced the memory footprint and computational time through a comprehensive approach with both algorithmic and system innovations. The neural network model is compressed by model tabulation, kernel fusion, and redundancy removal. Optimizations such as acceleration of customized kernels, tabulation of the activation function, and MPI+OpenMP parallelization are then implemented on GPU and ARM architectures. Testing results for the copper system show that the optimized code scales up to the entire machines of both Fugaku and Summit, and that the corresponding system size can be extended by a factor of 134.
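The "tabulation of the activation function" optimization mentioned above can be illustrated in the same spirit: a bounded lookup table for tanh evaluated by linear interpolation, trading a small amount of memory for cheaper transcendental evaluations. The range, table size, and names below are assumptions chosen for illustration; this is a sketch, not the production GPU/ARM kernels described in the paper.

    import numpy as np

    LO, HI, N = -8.0, 8.0, 8192           # assumed tabulation range and table size
    XS = np.linspace(LO, HI, N)
    TANH_TABLE = np.tanh(XS)

    def tanh_tabulated(x):
        # Approximate tanh by linear interpolation in the precomputed table;
        # inputs outside [LO, HI] are clamped, where tanh has already saturated.
        pos = (np.clip(x, LO, HI) - LO) / (HI - LO) * (N - 1)
        i = np.minimum(pos.astype(int), N - 2)   # left neighbour index
        t = pos - i                              # fractional offset in the cell
        return (1.0 - t) * TANH_TABLE[i] + t * TANH_TABLE[i + 1]

    x = np.random.default_rng(1).normal(scale=3.0, size=1_000_000)
    print("max abs error:", np.max(np.abs(np.tanh(x) - tanh_tabulated(x))))

Clamping outside the tabulated range is acceptable here only because tanh saturates; the grid spacing controls the interpolation error.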