2018
DOI: 10.1021/acs.jctc.8b00908
SchNetPack: A Deep Learning Toolbox For Atomistic Systems

Abstract: SchNetPack is a toolbox for the development and application of deep neural networks to the prediction of potential energy surfaces and other quantum-chemical properties of molecules and materials. It contains basic building blocks of atomistic neural networks, manages their training and provides simple access to common benchmark datasets. This allows for an easy implementation and evaluation of new models. For now, SchNetPack includes implementations of (weighted) atom-centered symmetry functions and the deep t…
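The abstract lists (weighted) atom-centered symmetry functions among the toolbox's building blocks. As an illustration only (not SchNetPack's own code), here is a minimal NumPy sketch of Behler's radial symmetry function G2 with the standard cosine cutoff; the parameter names `eta`, `r_s`, and `r_c` follow the conventional notation:

```python
import numpy as np

def cutoff(r, r_c):
    """Behler cosine cutoff: decays smoothly to zero at r_c."""
    return np.where(r < r_c, 0.5 * (np.cos(np.pi * r / r_c) + 1.0), 0.0)

def g2(distances, eta, r_s, r_c):
    """Radial symmetry function G2 for one central atom:
    sum over neighbor distances r of exp(-eta (r - r_s)^2) * f_c(r)."""
    r = np.asarray(distances, dtype=float)
    return np.sum(np.exp(-eta * (r - r_s) ** 2) * cutoff(r, r_c))

# Neighbors at 1.0, 1.5 and 4.0 Å with a 3.0 Å cutoff;
# the neighbor beyond the cutoff contributes nothing.
value = g2([1.0, 1.5, 4.0], eta=1.0, r_s=1.2, r_c=3.0)
```

A set of such functions with different (eta, r_s) pairs forms a fixed-length descriptor of each atom's environment that is invariant to rotation, translation, and neighbor permutation.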

Cited by 362 publications (412 citation statements)
References 40 publications
“…One of the first deep learning architectures to learn to represent molecules or materials is the family of Deep Tensor Neural Networks (DTNN) [14], with its recent addition SchNet [54,105]. While in kernel-based learning methods [106,2] chemical compounds are compared in terms of pre-specified kernel functions [8,107,108,109], DTNN and its extension SchNet learn a multi-scale representation of the properties of molecules or materials from large data sets.…”
Section: Deep Tensor Neural Nets, SchNet and Continuous Convolutions
confidence: 99%
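The excerpt above contrasts learned representations with kernel-based methods that compare compounds via pre-specified kernel functions. As an illustration of the latter idea (function names are our own, not from any cited package), a minimal NumPy sketch of kernel ridge regression with a Gaussian kernel over fixed descriptor vectors:

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    """Pre-specified Gaussian (RBF) kernel between descriptor vectors."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def krr_fit(X, y, sigma=1.0, lam=1e-6):
    """Solve (K + lam*I) alpha = y for the regression weights."""
    K = gaussian_kernel(X, X, sigma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def krr_predict(X_train, alpha, X_new, sigma=1.0):
    """Predict via the kernel between new and training descriptors."""
    return gaussian_kernel(X_new, X_train, sigma) @ alpha
```

The key contrast: here the similarity measure (the kernel) is fixed in advance, whereas DTNN/SchNet learn the representation, and hence the effective similarity between compounds, from the data.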
“…DTNN and SchNet have both reached highly competitive prediction quality both across chemical compound space and across configuration space in order to simulate molecular dynamics. In addition to their prediction quality, their scalability to large data sets [105] and their ability to extract novel chemical insights by means of their learnt representation make the DTNN family an increasingly popular research tool.…”
Section: Deep Tensor Neural Nets, SchNet and Continuous Convolutions
confidence: 99%
“…Besides general purpose ML tools such as scikit‐learn, tensorflow, and Pytorch, there has been an explosion of customized open‐source ML software libraries for materials science. A nonexhaustive list includes AutoMatminer, PROPhet for general materials ML; amp, ænet, and ANI for developing neural network potentials; CGCNN, MEGNet, and SchnetPack are graph‐based deep learning model packages for accurate crystal and/or molecule property modeling.…”
Section: Model Selection and Training
confidence: 99%
“…A very successful example is the Gaussian Approximation Potential (GAP) by Bartók et al that is based on Gaussian process regression [38]. Various other MLP approaches and implementations have been developed in recent years [51][52][53][54][55][56][57][58][59].…”
Section: Progress in Machine Learning Methods for Materials Simulations
confidence: 99%
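The excerpt above notes that GAP rests on Gaussian process regression. As a reminder of the underlying formula (illustrative only, not GAP's implementation), a minimal NumPy sketch of the GP posterior mean and variance under an RBF prior:

```python
import numpy as np

def rbf(X, Y, length=1.0):
    """RBF covariance between two sets of input points."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length ** 2)

def gp_predict(X_train, y_train, X_new, length=1.0, noise=1e-8):
    """GP posterior mean K_* K^-1 y and variance 1 - K_* K^-1 K_*^T."""
    K = rbf(X_train, X_train, length) + noise * np.eye(len(X_train))
    K_s = rbf(X_new, X_train, length)
    mean = K_s @ np.linalg.solve(K, y_train)
    var = 1.0 - np.einsum('ij,ji->i', K_s, np.linalg.solve(K, K_s.T))
    return mean, var
```

Unlike a plain neural network potential, the posterior variance gives a built-in uncertainty estimate, which is one reason GP-based potentials such as GAP remain popular alongside deep learning approaches.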