2023
DOI: 10.1063/5.0146803

ænet-PyTorch: A GPU-supported implementation for machine learning atomic potentials training

Abstract: In this work, we present ænet-PyTorch, a PyTorch-based implementation for training artificial neural network-based machine learning interatomic potentials. Developed as an extension of the atomic energy network (ænet), ænet-PyTorch provides access to all the tools included in ænet for the application and usage of the potentials. The package has been designed as an alternative to the internal training capabilities of ænet, leveraging the power of graphics processing units to facilitate direct training on forces …

Cited by 10 publications (6 citation statements)
References 63 publications
“…It should be noted that our focus is placed on the readability of the code implementation, rather than the software modularity or run‐time efficiency. Once learning the basics through this tutorial, the readers can adopt advanced software platforms, such as DeePMD‐kit, 22,59,60 ænet, 61,62 AMP, 63 MLatom, PhysNet, 37 SchNetPack, 36 sGDML, 64 TorchANI, 20 and TorchMD‐NET 65 for their own machine learning model development. It should also be noted that there are several other areas of research that are not covered.…”
Section: Discussion
confidence: 99%
“…In this work, the reference energy information and 10% of the reference force information in the training dataset underwent 5000 training epochs. We employed the aenet-PyTorch 34) software package as a graphics processing unit (GPU)-accelerated training framework for the BPNN. Note that we trained the BPNN with the following hyperparameters: 128 batch size, 10−4 learning rate, and 10−4 weight decay.…”
Section: Neural Network Interatomic Potential
confidence: 99%
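The training setup described in the statement above (a loss over reference energies plus a fraction of reference forces, batch size 128, learning rate 10−4, weight decay 10−4) can be sketched in plain PyTorch. This is an illustrative toy, not the actual ænet-PyTorch API: the synthetic descriptors and targets, the network shape, and the 0.1 force-loss weight are assumptions made for the sketch.

```python
# Hedged sketch of energy-plus-force training for a neural-network potential.
# NOT the aenet-PyTorch API; all data here are synthetic toys.
import torch

torch.manual_seed(0)
device = "cuda" if torch.cuda.is_available() else "cpu"

# Toy per-structure descriptors standing in for symmetry-function features.
n_samples, n_features = 512, 16
x = torch.randn(n_samples, n_features)
energies = x.pow(2).sum(dim=1, keepdim=True)  # synthetic energy targets
forces = -2.0 * x                             # -dE/dx for the toy energy

model = torch.nn.Sequential(
    torch.nn.Linear(n_features, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
).to(device)

# Hyperparameters quoted in the citing work: lr 1e-4, weight decay 1e-4.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4, weight_decay=1e-4)
loader = torch.utils.data.DataLoader(
    torch.utils.data.TensorDataset(x, energies, forces),
    batch_size=128, shuffle=True,
)

with torch.no_grad():
    mse_before = torch.nn.functional.mse_loss(
        model(x.to(device)), energies.to(device)).item()

force_weight = 0.1  # assumed weighting; loosely mirrors "10% force information"
for epoch in range(50):  # the cited work used 5000 epochs
    for xb, eb, fb in loader:
        xb = xb.to(device).requires_grad_(True)
        e_pred = model(xb)
        # Forces are minus the gradient of the predicted energy w.r.t. inputs.
        f_pred = -torch.autograd.grad(e_pred.sum(), xb, create_graph=True)[0]
        loss = (torch.nn.functional.mse_loss(e_pred, eb.to(device))
                + force_weight
                * torch.nn.functional.mse_loss(f_pred, fb.to(device)))
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

with torch.no_grad():
    mse_after = torch.nn.functional.mse_loss(
        model(x.to(device)), energies.to(device)).item()
```

The same structure — an energy head differentiated once more to obtain forces, with a weighted combined loss — is what makes GPU-side force training practical in frameworks like ænet-PyTorch.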
“…The development of highly accurate and efficient machine learning potentials by learning from ab initio data for describing electrocatalytic processes is promising. 60 Improvements to algorithms 61,62 and computational data ecosystems 63,64 will help to discover more about the phenomena buried at the solid−electrolyte interfaces. Finally, the complexity of electrochemical systems, such as the presence of multiple reaction pathways and the coupling of physical processes at different scales, e.g., mass transport, can also pose challenges for computational modeling.…”
Section: Current Limitations and Challenges in Computational Modeling
confidence: 99%
“…One requirement for the accuracy and efficiency of such enhanced sampling methods is an accurate potential energy surface, which can potentially be provided by well-trained machine learning potentials. The development of highly accurate and efficient machine learning potentials by learning from ab initio data for describing electrocatalytic processes is promising. Improvements to algorithms and computational data ecosystems will help to discover more about the phenomena buried at the solid–electrolyte interfaces.…”
Section: Current Limitations and Challenges in Computational Modeling
confidence: 99%