2023
DOI: 10.1039/d3sc02581k

Force-field-enhanced neural network interactions: from local equivariant embedding to atom-in-molecule properties and long-range effects

Thomas Plé, Louis Lagardère, Jean-Philip Piquemal

Abstract: We introduce FENNIX (Force-Field-Enhanced Neural Network InteraXions), a hybrid approach between machine learning and force fields.

Cited by 10 publications (9 citation statements)
References: 96 publications

Citation statements (ordered by relevance):
“…Consequently, intensive work has focused on incorporating long-range interactions in NNPs.1–3,26 Common strategies involve combining a local NNP with physical electrostatic potentials or incorporating nonlocal representations of the system into local descriptors.2,3,27 Another approach is to integrate them with existing force fields (FFs).…”
Section: Introduction
confidence: 99%
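To make the first strategy in the quote above concrete, here is a minimal sketch (an assumed form for illustration, not any cited paper's actual code) of pairing a local NNP with an explicit long-range electrostatic tail: the network covers everything inside a cutoff, and a smoothly switched Coulomb sum over predicted partial charges covers the rest. The charge array q, the switching form, and the local_nnp_energy callable are all hypothetical placeholders.

import numpy as np

COULOMB_K = 332.0637  # kcal/(mol e^2) * Angstrom (assumed unit system)

def long_range_electrostatics(pos, q, r_cut=5.0):
    """Coulomb tail beyond the NNP cutoff, switched on smoothly.

    pos: (N, 3) coordinates in Angstrom; q: (N,) predicted partial charges.
    The linear switch below is a placeholder; its only job is to avoid
    double-counting the short range already captured by the local NNP.
    """
    e = 0.0
    n = len(pos)
    for i in range(n):
        for j in range(i + 1, n):
            r = np.linalg.norm(pos[i] - pos[j])
            # 0 inside the cutoff (NNP territory), ramping to 1 at 2*r_cut
            s = np.clip((r - r_cut) / r_cut, 0.0, 1.0)
            e += s * COULOMB_K * q[i] * q[j] / r
    return e

def total_energy(pos, species, q, local_nnp_energy):
    # Short-range machine-learned part plus explicit long-range physics.
    return local_nnp_energy(pos, species) + long_range_electrostatics(pos, q)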
“…For example, the great ability of NNPs to capture local energy has been exploited to correct the short-range two-body intermolecular interactions in the ARROW FF.39 Alternatively, toward enabling full reactivity, FENNIX (Force-Field-Enhanced Neural Network InteraXions),26 a hybrid approach between machine learning and force fields, has also been introduced. It uses equivariant neural networks to predict all short-range many-body interactions, while accounting for long-range electrostatics and dispersion via the prediction of local energy contributions and multiple atom-in-molecule properties that feed physically motivated, geometry-dependent energy terms.…”
Section: Introduction
confidence: 99%
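The energy assembly described in this quote can be sketched schematically (a hypothetical illustration of the idea, not FENNIX's published model or API): an equivariant network emits per-atom short-range energies together with atom-in-molecule properties such as partial charges and dispersion coefficients, and the long-range terms are then assembled from standard physics. Damping and charge-penetration corrections used in production models are omitted for brevity.

import numpy as np

def assemble_hybrid_energy(pos, nn_outputs):
    """Schematic short-range + physics-based long-range energy assembly.

    nn_outputs is assumed to hold per-atom network predictions:
      'e_local' - short-range many-body energies,
      'q'       - partial charges,
      'c6'      - dispersion coefficients.
    """
    e_short = float(np.sum(nn_outputs["e_local"]))   # NN many-body part
    q, c6 = nn_outputs["q"], nn_outputs["c6"]
    e_elec = e_disp = 0.0
    n = len(pos)
    for i in range(n):
        for j in range(i + 1, n):
            r = np.linalg.norm(pos[i] - pos[j])
            e_elec += 332.0637 * q[i] * q[j] / r     # geometry-dependent electrostatics
            c6_ij = np.sqrt(c6[i] * c6[j])           # assumed combining rule
            e_disp -= c6_ij / r**6                   # undamped London dispersion
    return e_short + e_elec + e_disp

Because the charges and C6 coefficients are themselves network outputs, the long-range terms stay geometry-dependent, which is the point the quoted passage makes.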
“…The past few years have seen an explosion of new MLP architectures designed to improve their accuracy, speed, transferability, and data efficiency. Many of these architectures try to mirror the symmetries obeyed by the physical system, particularly equivariance under translations and rotations. Others try to improve transferability by directly incorporating a limited amount of physics into the model, such as explicit terms for Coulomb interactions, dispersion, and nuclear repulsion. Still others take this approach even further, beginning with a semiempirical QC method and training a machine learning model to correct for its flaws and improve its accuracy. These approaches provide different trade-offs between speed, accuracy, and range of applicability, with new architectures continuing to be introduced.…”
Section: Introduction
confidence: 99%
“…The pioneering work of Behler showcased that deep neural networks (NN) could be employed to learn a computationally cheaper surrogate model for the density functional theory potential energy surface of bulk silicon. Following the initial studies, other ML algorithms and NN architectures rapidly emerged. ML algorithms can be utilized to construct not only all-atom models but also CG models. Previous studies demonstrated a successful reproduction of structural and dynamical properties. However, most considered pure water solvent or implicitly treated the ions, even though they play a vital role in biological processes.…”
Section: Introduction
confidence: 99%
“…This work presents a deep implicit solvation (DIS) model for sodium chloride solutions, where water is coarse-grained out, whereas ions are modeled explicitly. The ML potential is based on an equivariant neural network (ENN) architecture due to its impressive data efficiency and its ability to generalize more accurately to out-of-distribution configurations. As stated by Allegro's developers, “the strict locality of the Allegro model naturally facilitates separation of the energy into a short-range term and a physically motivated long-range term.” Thus, rather than directly fitting the potential of mean force, we define a prior potential composed of the Lennard-Jones and electrostatic interactions. The ML potential is trained to account for the difference between the all-atom data and the prior potential, an approach also known as delta learning.…”
Section: Introduction
confidence: 99%
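The delta-learning workflow quoted above reduces to three steps, sketched here under assumed interfaces (prior_energy, the LJ parameters, and ml_model.predict are placeholders, not the paper's code): evaluate the Lennard-Jones-plus-Coulomb prior, fit the ML potential to the residual between the all-atom reference energies and that prior, and add the two back together at prediction time.

import numpy as np

def prior_energy(pos, q, sigma, eps):
    """Physics prior: pairwise Lennard-Jones + Coulomb (no cutoffs, for brevity)."""
    e = 0.0
    n = len(pos)
    for i in range(n):
        for j in range(i + 1, n):
            r = np.linalg.norm(pos[i] - pos[j])
            s = 0.5 * (sigma[i] + sigma[j])        # Lorentz combining rule
            e_ij = np.sqrt(eps[i] * eps[j])        # Berthelot combining rule
            e += 4.0 * e_ij * ((s / r) ** 12 - (s / r) ** 6)
            e += 332.0637 * q[i] * q[j] / r
    return e

def delta_targets(frames, ref_energies, q, sigma, eps):
    # Training targets: only what the prior fails to capture.
    return np.array([e_ref - prior_energy(f, q, sigma, eps)
                     for f, e_ref in zip(frames, ref_energies)])

def predict_total(frame, q, sigma, eps, ml_model):
    # Inference: asymptotic physics from the prior + learned residual.
    return prior_energy(frame, q, sigma, eps) + ml_model.predict(frame)

Fitting only the residual keeps the long-range asymptotics exact by construction, which is what makes carrying the prior worthwhile.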