2023
DOI: 10.1039/d2dd00150k
A neural network potential with rigorous treatment of long-range dispersion

Abstract: Neural Network Potentials (NNPs) have quickly emerged as powerful computational methods for modeling large chemical systems with the accuracy of quantum mechanical methods but at a much smaller computational cost....


Cited by 12 publications (6 citation statements)
References 61 publications
“…Such performances should continue to improve thanks to further Deep-HP optimizations, TorchANI updates and GPU hardware evolutions. Deep-HP will enable the implementation of the next generation of improved MLPs [84][85][86] and has been designed to be a place for their further development. It will include direct neural network coupling with physics-driven contributions going beyond multipolar electrostatics and polarization through the inclusion of many-body dispersion models.…”
Section: Discussion (mentioning)
confidence: 99%
“…We now consider different ANI models to rerank the matches found in Section 3.1. We consider ANI1x and ANI2x from TorchANI, 57 ANI1x with dispersion 55 and ANIOE62, a model we introduced. The model name is derived from the OE62 data set, which is a collection of 61,489 molecules extracted from organic crystals in the CSD.…”
Section: Comparing the Ranking Ability of Other ANI Models (mentioning)
confidence: 99%
“…For the case of finding low-energy polymorphs, we calculated the total energy of crystal structures. To rapidly compute total energies and perform structural relaxations, we chose to incorporate the ANI ML models, with 55 and without 31−33 dispersion corrections, into our pipeline. The ANI models are trained on millions of organic molecules and are accurate across different domains.…”
Section: Introduction (mentioning)
confidence: 99%
“…10,14,15,24−30 MLPs trained on active learned data tend to yield more stable molecular dynamics simulations. 31,32 MLPs have been successfully applied to predict potential energy surfaces 7−9,11,15,23,33−36 and have been extended to charges, 12,37−40 spin, 41 dispersion coefficients, 42 and bond-order quantities. 43 Training data sets are typically obtained with density functional theory (DFT), which serves as a reasonably accurate and numerically accessible reference QM approach.…”
Section: Introduction (mentioning)
confidence: 99%