2020
DOI: 10.1002/cctc.202000774
Active Learning A Neural Network Model For Gold Clusters & Bulk From Sparse First Principles Training Data

Abstract: Small metal clusters are of fundamental scientific interest and of tremendous significance in catalysis. These nanoscale clusters display diverse geometries and structural motifs depending on the cluster size; knowledge of these size-dependent structural motifs and their dynamical evolution has been of longstanding interest. Given the high computational cost of first-principles calculations, molecular modeling and atomistic simulations such as molecular dynamics (MD) have proven to be an important complementary …

Cited by 22 publications (22 citation statements) | References 71 publications
“…Here, the unlabeled data are abundant, but labeling (obtaining a new QM energy) is expensive. Different labeling approaches are known from the literature: Deep Potential Generator (DP-GEN) and AL-NN with a Nested Ensemble Monte Carlo scheme. To develop the ANI-1x MLIP, we generated the AL training data set, updated on the fly via a Query by Committee (QBC) algorithm (Figure ).…”
Section: Advanced Methods for Accuracy and Transferability Improvement (mentioning)
confidence: 99%
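As a rough illustration of the Query by Committee selection step mentioned in the excerpt above, the sketch below trains a small committee of neural networks on a sparse labeled set and flags the unlabeled configurations where the committee disagrees most for new DFT labeling. The descriptor dimension, committee size, and selection batch size are placeholder assumptions, not the settings used in the cited ANI-1x or DP-GEN workflows.

```python
# Minimal Query-by-Committee (QBC) selection sketch for active learning of an ML
# interatomic potential. All sizes and the random toy data are illustrative only.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Toy stand-ins: descriptors of the sparse labeled set and a large unlabeled pool.
X_labeled = rng.normal(size=(50, 8))    # e.g. per-configuration descriptors
y_labeled = rng.normal(size=50)         # e.g. DFT energies per atom
X_pool = rng.normal(size=(5000, 8))     # unlabeled candidate configurations

# Train a committee of small NNs on bootstrap resamples of the labeled data.
committee = []
for seed in range(5):
    idx = rng.integers(0, len(X_labeled), size=len(X_labeled))
    model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=seed)
    model.fit(X_labeled[idx], y_labeled[idx])
    committee.append(model)

# Committee disagreement (std. dev. of the predictions) is the query criterion:
# the configurations where the members disagree most are sent for new QM labels.
preds = np.stack([m.predict(X_pool) for m in committee])  # shape (n_models, n_pool)
disagreement = preds.std(axis=0)
to_label = np.argsort(disagreement)[-20:]                 # top-20 most uncertain
print("pool indices selected for DFT labeling:", to_label)
```

In a full loop, the selected configurations would be labeled with new first-principles calculations, appended to the training set, and the committee retrained until the disagreement on fresh MD snapshots falls below a chosen threshold.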
“…Here, the unlabeled data are abundant, but labeling (obtaining a new QM energy) is expensive. Different labeling approaches are known from the literature: Deep Potential Generator (DP-GEN) [44] and AL-NN [45] …”
Section: Active Learning (mentioning)
confidence: 99%
“…We further note that the present work did not aim to develop an ML model with a minimal amount of data; instead, it focused on the proof of concept of an ML-based characterization of polymer phase transformation. Therefore, it is quite possible to build ML models with even lower amounts of data, especially by using advanced concepts such as active learning to build ML models from sparse data. Our future work will focus on building an ML-based polymer phase characterization technique with a minimal amount of data and on a holistic comparison of the MD–ML approach with a pure MD route across diverse polymeric systems, including chemically realistic models. Although dPOLY is tested on the coil-to-globule transition, we expect this approach to identify other phase transitions and dynamical crossovers.…”
Section: Discussion (mentioning)
confidence: 99%
“…All calculations were done with an energy cutoff of 600 eV. A dense k-point grid, defined by the product n_k × n_a (where n_a is the number of atoms in the primitive cell and n_k is the number of k-points), is employed [100,101]. The phonon modes were computed from the Hessian matrix, obtained from density functional perturbation theory, using the PHONOPY package [102].…”
Section: Methods (mentioning)
confidence: 99%
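Read literally, the k-point rule quoted above fixes the grid through the product of the number of k-points and the number of atoms in the primitive cell; the exact target value of that product does not survive the excerpt. The short sketch below works through the arithmetic for an assumed target of 1000 k-points × atoms, which is a placeholder rather than the value used in the cited work.

```python
# Worked example of the quoted k-point density rule: choose the smallest n x n x n
# grid such that n_k * n_a >= target, where n_k = n**3 is the number of k-points
# and n_a is the number of atoms in the primitive cell.
# The target of 1000 k-points x atoms is an assumed placeholder value.

def kpoints_per_direction(n_atoms: int, target: int = 1000) -> int:
    n = 1
    while n ** 3 * n_atoms < target:
        n += 1
    return n

for n_atoms in (1, 2, 4, 8):
    n = kpoints_per_direction(n_atoms)
    print(f"{n_atoms} atom(s): {n} x {n} x {n} grid, n_k * n_a = {n ** 3 * n_atoms}")
```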