2022
DOI: 10.26434/chemrxiv-2022-mdz85
Preprint

A Neural Network Potential with Rigorous Treatment of Long-Range Dispersion

Abstract: Neural Network Potentials (NNPs) have quickly emerged as powerful computational methods for modeling large chemical systems with the accuracy of quantum mechanical methods but at a much smaller computational cost. To make the training and evaluation of the underlying neural networks practical, these methods commonly cut off interatomic interactions at a modest range (e.g., 5 Å), so longer-range interactions like London dispersion are neglected. This limits the accuracy of these models for intermolecular inter…
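For readers unfamiliar with why a 5 Å cutoff discards dispersion: most NNP descriptors scale every pair contribution by a smooth cutoff function and drop pairs beyond r_cut entirely. Below is a minimal sketch of the widely used Behler–Parrinello cosine cutoff; this is an illustrative choice, since the preprint's exact functional form is not shown in this excerpt.

```python
import numpy as np

def cosine_cutoff(r, r_cut=5.0):
    """Behler-Parrinello-style cosine cutoff: scales pair terms smoothly
    to zero at r_cut. Anything beyond r_cut -- including the slowly
    decaying -C6/r^6 London dispersion tail -- contributes exactly zero."""
    return np.where(r < r_cut, 0.5 * (np.cos(np.pi * r / r_cut) + 1.0), 0.0)

# At 3.0 and 4.9 Angstrom the pair still contributes; at 5.1 and 8.0 it is
# silently discarded, which is the source of the missing dispersion energy.
print(cosine_cutoff(np.array([3.0, 4.9, 5.1, 8.0])))  # ~[0.345, 0.001, 0.0, 0.0]
```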

Cited by 3 publications (3 citation statements)
References 0 publications
“…Certain applications may require more direct treatment of long-range effects. Future work could investigate incorporating recent developments, such as explicit long-range terms, 63 charge equilibration schemes, 64 or graph NN models 5,6,13–16 that can implicitly account for long-range interactions. A recent advancement in ML for natural language processing is the concept of foundational models, that is, large, general models usually trained with unlabelled data that can be specialized to specific tasks quickly with very small amounts of data. 65 …”
Section: Discussion (mentioning, confidence: 99%)
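The quote above names charge equilibration as one route to long-range electrostatics. For context, here is a minimal sketch of the classic QEq linear solve in the Rappé–Goddard spirit; the per-atom electronegativities chi, hardnesses eta, and bare Coulomb kernel are illustrative assumptions (atomic units), not parameters taken from the cited work.

```python
import numpy as np

def charge_equilibration(pos, chi, eta, q_total=0.0):
    """Minimal QEq solver: minimize
        E(q) = sum_i (chi_i q_i + 0.5 eta_i q_i^2) + 0.5 sum_{i!=j} q_i q_j / r_ij
    subject to sum_i q_i = q_total. Stationarity plus the charge constraint
    reduce to a single linear system. pos in Bohr; chi, eta in Hartree."""
    n = len(chi)
    r = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)
    H = np.zeros((n, n))
    off = ~np.eye(n, dtype=bool)
    H[off] = 1.0 / r[off]           # bare Coulomb kernel between distinct atoms
    H[np.diag_indices(n)] = eta     # atomic hardness on the diagonal
    # Augmented system enforcing the total charge via a Lagrange multiplier
    A = np.block([[H, np.ones((n, 1))], [np.ones((1, n)), np.zeros((1, 1))]])
    b = np.concatenate([-np.asarray(chi, dtype=float), [q_total]])
    return np.linalg.solve(A, b)[:n]
```

Because the charges respond to the full Coulomb kernel, such schemes propagate electrostatic information across the whole system rather than stopping at a descriptor cutoff, which is why the citing authors list them as an implicit long-range mechanism.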
“…Such performances should continue to improve thanks to further Deep-HP optimizations, TorchANI updates and GPU hardware evolution. Deep-HP will enable the implementation of the next generation of improved MLPs 84–86 and has been designed to be a place for their further development. It will include direct coupling of neural networks with physics-driven contributions, going beyond multipolar electrostatics and polarization through the inclusion of many-body dispersion models.…”
Section: Discussion (mentioning, confidence: 99%)
“…A more optimal solution would be to add an explicit dispersion correction to ANI-1xnr that captures long-range interactions while maintaining an accurate description of the local environment. 45 We also compare the ANI-1xnr lattice constants with those from ANI-2x, 46 a model explicitly trained on small organic molecules, as a baseline. ANI-2x performs poorly at predicting the lattice constants for both diamond cubic and graphite.…”
Section: Carbon Solid-phase Nucleation (mentioning, confidence: 99%)
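The quoted passage proposes bolting an explicit dispersion correction onto a short-cutoff model. One standard form (an assumption here, not necessarily the correction used by the preprint or the ANI-1xnr authors) is a damped pairwise C6/r^6 term in the spirit of Grimme's D2, sketched with hypothetical pair tables c6 and r0:

```python
import numpy as np

def d2_dispersion(pos, c6, r0, s6=1.0, d=20.0):
    """Grimme D2-style pairwise correction, a common post-hoc fix for
    short-cutoff NNPs:
        E_disp = -s6 * sum_{i<j} C6_ij / r_ij^6 * f_damp(r_ij),
        f_damp(r) = 1 / (1 + exp(-d * (r / R0_ij - 1))).
    pos: (N, 3) coordinates; c6, r0: (N, N) pair parameters (C6 coefficients,
    van der Waals radii sums). Units must be consistent (e.g., A and eV*A^6)."""
    n = len(pos)
    e = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            rij = np.linalg.norm(pos[i] - pos[j])
            fdamp = 1.0 / (1.0 + np.exp(-d * (rij / r0[i, j] - 1.0)))
            e -= s6 * c6[i, j] / rij**6 * fdamp
    return e
```

Because f_damp goes to zero at short range, the correction leaves the NNP's well-trained local description essentially untouched and only restores the missing long-range tail, which is exactly the division of labor the citing authors describe.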