2022
DOI: 10.48550/arxiv.2206.02005
Preprint
Open Challenges in Developing Generalizable Large Scale Machine Learning Models for Catalyst Discovery

Abstract: The development of machine learned potentials for catalyst discovery has predominantly been focused on very specific chemistries and material compositions. While effective in interpolating between available materials, these approaches struggle to generalize across chemical space. The recent curation of large-scale catalyst datasets has offered the opportunity to build a universal machine learning potential, spanning chemical and composition space. If accomplished, said potential could accelerate the catalyst d…

Cited by 4 publications (7 citation statements)
References 53 publications
“…The underlying theoretical details of these methods are beyond the scope of this review, and we refer the reader to several excellent reviews that have recently been published for more detail. [45][46][47][48][49][50][51] At a high level, these methods work by defining a mapping from atomic coordinates to energies and forces (and occasionally virial tensors). This mapping contains a large number of parameters (weights and biases) that can be systematically adjusted to minimise the error on a set of training data, combined with an algorithm to systematically optimise the parameters (backpropagation).…”
Section: Neural Network Potential Molecular Dynamics (NNP-MD)
confidence: 99%
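The idea described above — a parametric mapping from coordinates to energy and force, with parameters fitted by gradient descent on reference data — can be sketched in miniature. The toy below is purely illustrative (not from the paper): a one-parameter "potential" for a 1-D diatomic, fitted to harmonic reference energies, with the force obtained as the analytic derivative of the learned energy.

```python
# Illustrative sketch of a machine-learned potential: a parametric mapping
# from coordinates to energy, fitted by gradient descent on reference data.
# Toy setup (hypothetical, not from the paper): harmonic "ground truth"
# E(r) = 0.5 * k_true * (r - r0)**2 for a 1-D diatomic.

k_true, r0 = 4.0, 1.0
data = [(r / 10.0, 0.5 * k_true * (r / 10.0 - r0) ** 2) for r in range(5, 16)]

k = 1.0    # the learnable parameter (stands in for the "weights and biases")
lr = 50.0  # step size, tuned for this tiny problem
for _ in range(200):
    # Gradient of the squared energy error w.r.t. k (up to a constant factor)
    grad = sum((0.5 * k * (r - r0) ** 2 - e) * (r - r0) ** 2
               for r, e in data) / len(data)
    k -= lr * grad  # the backpropagation-style parameter update

# Forces come "for free" as the negative derivative of the learned energy
force = lambda r: -k * (r - r0)
print(round(k, 3))  # converges toward k_true = 4.0
```

A real NNP replaces the single parameter with a deep network over symmetry-invariant descriptors of the local atomic environment, but the training loop has the same shape: predict energy, compare to reference data, update parameters along the gradient.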
“…When evaluating performance, we define success as finding an adsorption energy within an acceptable tolerance of the DFT adsorption energy in OC20-Dense (0.1 eV in this work [2,33,42]), or lower. Note that the ground-truth adsorption energies in OC20-Dense are an upper bound, since it is possible that a lower adsorption energy exists.…”
Section: Relaxations
confidence: 99%
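The success criterion above reduces to a one-line comparison: because the reference energy is only an upper bound, any predicted energy at or below the reference-plus-tolerance counts as a success. A minimal sketch (the helper name and signature are hypothetical, not from the paper; energies in eV):

```python
def relaxation_success(e_ml, e_dft, tol=0.1):
    """Success if the ML-found adsorption energy is within `tol` eV of,
    or lower than, the DFT reference (which is itself an upper bound)."""
    return e_ml <= e_dft + tol

print(relaxation_success(-1.95, -2.00))  # within 0.1 eV -> True
print(relaxation_success(-2.30, -2.00))  # below the DFT upper bound -> True
print(relaxation_success(-1.70, -2.00))  # 0.3 eV too high -> False
```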
“…Recently, machine learning (ML) potentials for estimating atomic forces and energies have shown significant progress on standard benchmarks while being orders of magnitude faster than DFT [2,[27][28][29][30][31][32]. While ML accuracies on the large and diverse Open Catalyst 2020 (OC20) dataset have improved to 0.3 eV for relaxed energy estimation, an accuracy of 0.1 eV is still desired for accurate screening [33]. This raises the question of whether a hybrid approach that uses both DFT and ML potentials can achieve high accuracy while maintaining efficiency.…”
confidence: 99%
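One common form of the hybrid approach alluded to above is a two-stage screen: a cheap ML potential ranks all candidates, and expensive DFT is spent only on the shortlist. A hedged sketch under assumed stand-in functions (`ml_energy` and `dft_energy` are hypothetical placeholders, not the paper's workflow):

```python
# Hypothetical two-stage hybrid screen: ML ranks, DFT verifies the shortlist.
def ml_energy(x):
    # Stand-in for a fast ML potential: cheap but slightly biased
    return (x - 3) ** 2 + 0.3

def dft_energy(x):
    # Stand-in for an accurate but expensive DFT calculation
    return (x - 3) ** 2

candidates = [0, 1, 2, 3, 4, 5]
# ML pass: keep only the most promising candidates for DFT verification
shortlist = sorted(candidates, key=ml_energy)[:2]
best = min(shortlist, key=dft_energy)
print(best)  # the lowest-energy candidate survives both stages
```

The design point is that DFT cost scales with the shortlist size, not the candidate pool, so even a modestly accurate ML potential can cut the number of DFT calls dramatically.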