2022
DOI: 10.1021/acs.jctc.2c00255

Graph Neural Networks for Learning Molecular Excitation Spectra

Abstract: Machine learning (ML) approaches have demonstrated the ability to predict molecular spectra at a fraction of the computational cost of traditional theoretical chemistry methods while maintaining high accuracy. Graph neural networks (GNNs) are particularly promising in this regard, but different types of GNNs have not yet been systematically compared. In this work, we benchmark and analyze five different GNNs for the prediction of excitation spectra from the QM9 dataset of organic molecules. We compare the GNN …
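The abstract describes GNNs that map a molecular graph directly to an excitation spectrum. As a rough illustration of the general idea only (not any of the five architectures benchmarked in the paper — the weights, molecule, and dimensions below are all made up), a toy message-passing readout that emits a discretized, non-negative spectrum might look like:

```python
import numpy as np

def message_passing_spectrum(node_feats, adj, w_msg, w_upd, w_out, n_rounds=2):
    """Toy message-passing readout: update node states by summing
    neighbor messages, then pool to a discretized spectrum.
    Illustrative sketch only -- not the architectures from the paper."""
    h = node_feats
    for _ in range(n_rounds):
        msgs = adj @ (h @ w_msg)           # aggregate transformed neighbor states
        h = np.tanh(h @ w_upd + msgs)      # node-state update
    pooled = h.sum(axis=0)                 # permutation-invariant readout
    return np.log1p(np.exp(pooled @ w_out))  # softplus -> non-negative intensities

rng = np.random.default_rng(0)
n_atoms, d, n_bins = 5, 8, 32
feats = rng.normal(size=(n_atoms, d))
adj = np.zeros((n_atoms, n_atoms))
for i, j in [(0, 1), (1, 2), (2, 3), (3, 4)]:  # a toy chain "molecule"
    adj[i, j] = adj[j, i] = 1.0
spec = message_passing_spectrum(
    feats, adj,
    rng.normal(size=(d, d)), rng.normal(size=(d, d)), rng.normal(size=(d, n_bins)))
print(spec.shape)  # (32,) -- one intensity per spectral bin
```

In a trained model the weight matrices would be fit so that the binned output matches reference (e.g., TDDFT) spectra; here they are random and serve only to show the data flow.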

Cited by 17 publications (12 citation statements)
References 37 publications
“…Data-driven machine learning (ML) has recently been employed to predict the density of states (DOS) and quasiparticle energies at the DFT level from atomic configurations alone. Using thousands (or more) of discretized DOS on the real-frequency axis as training data, Gaussian process regression or deep neural network models are trained to predict the DOS of organic molecules, bulk crystals, and amorphous materials, although the quality of the results often depends on the resolution chosen to smooth the DOS. A different ML approach is based on the SchNet model, where a latent Hamiltonian matrix is first predicted and molecular resonances are obtained as eigenvalues of the effective Hamiltonian. In the meantime, ML has also been explored to predict GW corrections to quasiparticle energy levels from DFT inputs. In addition, we note recent work on ML dielectric screening for accelerating GW and Bethe–Salpeter equation (BSE) calculations. However, to the best of our knowledge, no current ML model predicts photoemission spectra at the general quantum many-body level beyond the independent-particle approximation while properly accounting for quasiparticle renormalization and satellites, which are important spectral features in correlated electron systems.…”
Section: Introduction
confidence: 99%
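The statement above mentions a SchNet-based approach in which a latent Hamiltonian matrix is predicted and molecular resonances are read off as its eigenvalues. The post-processing step of that idea can be sketched in a few lines (the "predicted" matrix here is a random stand-in for a network's output, not real model output):

```python
import numpy as np

def resonances_from_latent_hamiltonian(h_pred):
    """Symmetrize a predicted latent matrix and diagonalize it;
    the eigenvalues play the role of molecular resonances."""
    h_sym = 0.5 * (h_pred + h_pred.T)  # enforce Hermiticity -> real eigenvalues
    return np.linalg.eigvalsh(h_sym)   # ascending real spectrum

rng = np.random.default_rng(1)
h_pred = rng.normal(size=(6, 6))       # stand-in for a network's raw output
eps = resonances_from_latent_hamiltonian(h_pred)
print(np.all(np.diff(eps) >= 0))       # True: eigvalsh returns sorted values
```

Symmetrizing before diagonalization is what guarantees real eigenvalues, which is why effective-Hamiltonian models typically parameterize a symmetric (Hermitian) matrix.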
“…The technological progress of ML is now manifested in nearly all branches of science and technology [1–11]. Through proper handling of powerful computation and high-throughput experimentation, ML has expedited scientific research and technological development [12–31]. Even though the adoption of data-guided growth of materials is inspiring for realizing the full potential of ML models, they should also offer more than purely predictive ability. The predictions and inner workings of the models must offer an appropriate explanation to human specialists.…”
Section: Introduction
confidence: 99%
“…Obtaining quantitative insight into a particular observation, whether through theoretical calculations or experimentation, is always time-consuming. In this work, we employed various deep learning and machine learning tools, such as fuzzy logic [11–16], artificial neural networks (assisted by three different training algorithms) [17–22], an adaptive neuro-fuzzy inference system [23–27], and decision tree regression analysis [28–34], on our existing anion-sensing dataset of an Os(II) polyheterocyclic complex, both to properly understand and to fully predict its anion-sensing characteristics within a very short period of time (Chart 1) [35]. We utilized herein an imidazolyl bis-benzimidazole-based Os(II) complex having three azole NH motifs in its periphery (Chart 1).…”
Section: Introduction
confidence: 99%
“…Recurrent neural network (RNN) and artificial neural network function-fitting (ANN-FF) grids have commonly been employed for this purpose [17–22]. In this work, we utilized the artificial neural network function-fitting technique because of the static character of the present scheme and its aptitude for recognizing and predicting complicated systems. An artificial neural network is quite capable of grasping the data, but is less skilled at recognizing the significance of each neuron and its weight.…”
Section: Introduction
confidence: 99%
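The ANN-FF technique mentioned above amounts to fitting a feed-forward network to a static input-output dataset. A minimal sketch of that idea, assuming a one-hidden-layer network trained by plain full-batch gradient descent on a toy dataset (none of this reflects the cited authors' actual data or training algorithms):

```python
import numpy as np

def fit_ann_ff(x, y, hidden=16, lr=0.05, epochs=2000, seed=0):
    """One-hidden-layer network trained by gradient descent --
    the 'function fitting' use of an ANN on a static dataset."""
    rng = np.random.default_rng(seed)
    w1 = rng.normal(scale=0.5, size=(x.shape[1], hidden))
    b1 = np.zeros(hidden)
    w2 = rng.normal(scale=0.5, size=(hidden, 1))
    b2 = np.zeros(1)
    for _ in range(epochs):
        a = np.tanh(x @ w1 + b1)            # hidden activations
        pred = a @ w2 + b2
        err = pred - y                      # residual for squared-error loss
        g2 = a.T @ err / len(x)             # gradient w.r.t. output weights
        ga = err @ w2.T * (1 - a ** 2)      # backprop through tanh
        g1 = x.T @ ga / len(x)              # gradient w.r.t. input weights
        w2 -= lr * g2; b2 -= lr * err.mean(axis=0)
        w1 -= lr * g1; b1 -= lr * ga.mean(axis=0)
    return lambda q: np.tanh(q @ w1 + b1) @ w2 + b2

x = np.linspace(-1, 1, 64).reshape(-1, 1)
y = x ** 2                                  # toy static response to fit
model = fit_ann_ff(x, y)
mse = float(np.mean((model(x) - y) ** 2))   # should fall well below var(y)
```

The "static character" the quote refers to is visible here: there is no sequence or recurrence, just a fixed mapping from inputs to targets, which is why a feed-forward fit suffices where an RNN would be unnecessary.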