2022
DOI: 10.1063/5.0083301
An orbital-based representation for accurate quantum machine learning

Abstract: We introduce an electronic structure based representation for quantum machine learning (QML) of electronic properties throughout chemical compound space. The representation is constructed using computationally inexpensive ab initio calculations and explicitly accounts for changes in the electronic structure. We demonstrate the accuracy and flexibility of resulting QML models when applied to property labels, such as total potential energy, HOMO and LUMO energies, ionization potential, and electron affinity, usi…

Cited by 17 publications (21 citation statements) | References 64 publications
“…In our previous studies, we have illustrated the excellent accuracy and transferability of MOB-ML for learning molecular energies using two thermalized organic molecule datasets, QM7b-T and GDB-13-T. 37,38,46,59 In this study, we systematically examine the learning performance of MOB-ML for both dipole moments and energies using the benchmark organic chemistry dataset QM9, 68 which contains optimized structures of 133,885 molecules with up to nine heavy atoms (HAs) of C, O, N, and F. QM9 is a standard benchmark dataset that has been assessed in many different literature studies. 29,36,41,49–58 Figure 4 displays the predicted MAEs for dipole moments (in mDebye) and energies (in kcal/mol) as functions of the number of training geometries on a log-log scale (learning curves). Since our GPR implementation, AltBBMM, can train on at most 1 million points, we collect the results of MOB-ML (GPR) trained on up to 1000 dipole moments and 2000 molecular energies, respectively.…”
Section: B. MOB-ML for Dipole Moments and Energies of Organic Molecules
confidence: 99%
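The learning curves mentioned above (MAE versus number of training geometries on a log-log scale) typically follow a power law, MAE ≈ a · N^(−b), which appears as a straight line on log-log axes. A minimal sketch of extracting that power law, using purely illustrative error values (not data from the cited papers):

```python
import numpy as np

# Hypothetical learning-curve data: training-set sizes and MAEs (kcal/mol).
# These values are illustrative only, not taken from MOB-ML results.
n_train = np.array([100, 200, 500, 1000, 2000])
mae = np.array([2.1, 1.5, 0.9, 0.62, 0.43])

# On a log-log scale a power law MAE = a * N**(-b) becomes a straight line:
# log(MAE) = log(a) - b * log(N). Fit it with a degree-1 polynomial.
slope, intercept = np.polyfit(np.log(n_train), np.log(mae), 1)
a, b = np.exp(intercept), -slope
print(f"MAE ~ {a:.2f} * N^(-{b:.2f})")
```

The fitted exponent b summarizes learning efficiency: a steeper (more negative) slope on the log-log plot means fewer training points are needed to reach a target accuracy.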
“…Similarly, the prediction errors of energies from MOB-ML approaches are compared with SchNet, 29 SLATM, 56 PhysNet, 36 SOAP, 57 FCHL18, 55 OrbNet-Equi, 51 and QML 49 in Fig. 4b.…”
Section: B. MOB-ML for Dipole Moments and Energies of Organic Molecules
confidence: 99%
“…We further investigate the effects of changing the learning protocol to KA-GPR on the accuracy and transferability for large closed-shell organic benchmark systems, including QM9, QM7b-T, and GDB-13-T, which were tested in previous studies. 20,21,29,31,41 In this work, we compare the results of three MO-based or AO-based ML approaches using ∆-learning ideas, MOB-ML, 14,20,21,29,31,41,42 OrbNet-Equi (∆-learning), 34 and QML (MO, ∆-learning), 32 for these benchmark datasets. We note that QML (MO, ∆-learning) also adopts the kernel addition technique in kernel ridge regression to construct its models.…”
Section: Results
confidence: 99%
“…This framework could be considered an implicit one-body decomposition of the correlation energies. The successes of the OrbNet 24,34 and QML (MO, ∆-learning) 32 approaches, which both use one-body features, also demonstrate that implicit one-body decompositions are sufficient for ML to provide accurate predictions. Moreover, including off-diagonal contributions would significantly increase the kernel construction cost from O(N^2) to O(N^4), where N is the number of occupied orbitals.…”
Section: B. MOB-ML Features Generated from ROHF for Open-Shell Systems
confidence: 99%
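The scaling claim above can be made concrete with a simple counting model: with one feature vector per occupied orbital (diagonal, one-body features), a kernel between two molecules sums a base kernel over N × N orbital pairs, i.e. O(N^2) evaluations; with one feature vector per ordered orbital pair (off-diagonal features) there are N^2 vectors per molecule, so the same sum costs O(N^4). The sketch below is a hypothetical counting function, not code from the cited papers:

```python
from itertools import product

def kernel_evals(n_orbitals: int, include_offdiagonal: bool) -> int:
    """Count base-kernel evaluations for the kernel between two molecules,
    each with n_orbitals occupied orbitals (illustrative counting model).
    """
    # Diagonal features: one feature vector per occupied orbital.
    # Off-diagonal features: one feature vector per ordered orbital pair.
    n_features = n_orbitals ** 2 if include_offdiagonal else n_orbitals
    # The molecular kernel sums a base kernel over all feature-vector pairs.
    return sum(1 for _ in product(range(n_features), repeat=2))

n = 10
print(kernel_evals(n, include_offdiagonal=False))  # N^2  -> 100
print(kernel_evals(n, include_offdiagonal=True))   # N^4  -> 10000
```

For a molecule with a few dozen occupied orbitals, the jump from N^2 to N^4 is two to three orders of magnitude, which is why restricting to diagonal (one-body) contributions keeps kernel construction tractable.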