2020
DOI: 10.1021/acs.jpcc.0c00329

Deep Learning for Optoelectronic Properties of Organic Semiconductors

Abstract: Atomistic modeling of energetic disorder in organic semiconductors (OSCs) and its effects on the optoelectronic properties of OSCs requires a large number of excited-state electronic-structure calculations, a computationally daunting task for many OSC applications. In this work, we advocate the use of deep learning to address this challenge and demonstrate that state-of-the-art deep neural networks (DNNs) are capable of predicting the electronic properties of OSCs at an accuracy comparable with the quantum chem…

Cited by 47 publications (52 citation statements)
References 138 publications
“…88 In our previous work, we have shown that SchNet can reliably predict various electronic properties (e.g., the HOMO-LUMO gap and excited-state energies) of OTs up to 6T with average errors in the range of 20-80 meV. 40 It is worth emphasizing that the transfer learning protocol advocated here does not depend on the specific underlying ML model; a similar performance improvement could be obtained even when a different ML model is used, as demonstrated in the Supporting Information (SI), where we use a multilevel graph convolutional neural network (MGCN) as the underlying ML model. 48 The transfer learning protocol used in this work is outlined in Fig.…”
Section: B Model Trainingmentioning
confidence: 97%
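The transfer learning protocol the excerpt describes pretrains a network on abundant data from a cheap level of theory and then fine-tunes on scarce, more accurate data. A minimal NumPy sketch of that idea, using a toy two-layer network and synthetic targets (the network, targets, and sizes are illustrative, not the authors' SchNet setup):

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(x, W1, b1, W2, b2):
    h = np.tanh(x @ W1 + b1)            # shared representation layer
    return h @ W2 + b2, h

def fit(x, y, params, lr=0.2, epochs=3000, freeze_features=False):
    W1, b1, W2, b2 = params
    n = len(x)
    for _ in range(epochs):
        pred, h = forward(x, W1, b1, W2, b2)
        err = pred - y
        dh = (err @ W2.T) * (1.0 - h ** 2)   # backprop through tanh
        W2 -= lr * h.T @ err / n
        b2 -= lr * err.mean(0)
        if not freeze_features:              # frozen during fine-tuning
            W1 -= lr * x.T @ dh / n
            b1 -= lr * dh.mean(0)
    return [W1, b1, W2, b2]

# Step 1: pretrain on abundant "low-level" labels (toy stand-in for cheap theory).
x_low = rng.uniform(-2, 2, (400, 1))
params = [rng.normal(0, 0.5, (1, 32)), np.zeros(32),
          rng.normal(0, 0.5, (32, 1)), np.zeros(1)]
params = fit(x_low, np.sin(2 * x_low), params)

# The "high-level" target is the same function plus a systematic shift,
# mimicking a more accurate (and more expensive) level of theory.
high = lambda x: np.sin(2 * x) + 0.25
x_test = np.linspace(-2, 2, 100).reshape(-1, 1)
mae_pre = np.abs(forward(x_test, *params)[0] - high(x_test)).mean()

# Step 2: fine-tune only the output head on 20 scarce high-level points.
x_high = rng.uniform(-2, 2, (20, 1))
params = fit(x_high, high(x_high), params, epochs=1000, freeze_features=True)
mae_post = np.abs(forward(x_test, *params)[0] - high(x_test)).mean()

print(f"MAE before fine-tuning: {mae_pre:.3f}, after: {mae_post:.3f}")
```

Freezing the representation layer during fine-tuning is what lets the scarce high-level data correct the systematic offset without destroying the pretrained features, which is why the protocol is largely independent of the underlying ML model.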
“…DFT was employed to compute the HOMO-LUMO gap, and TDDFT with the Tamm-Dancoff approximation was used for the excitation energy. The CAM-B3LYP functional and 6-31+G(d) basis set were chosen based on their agreement with coupled-cluster calculations in our previous study, 40 where the average error for the excitation energies of OTs up to 6T using CAM-B3LYP/6-31+G(d) was estimated to be around 200-300 meV. All quantum chemical calculations were performed using the PySCF program, 83 and density fitting was adopted with the heavy-aug-cc-pvdz-jkfit auxiliary basis set, as implemented in PySCF.…”
Section: A Data Generationmentioning
confidence: 99%
“…By employing ML methods to make predictions from coarse-grained representations, their model significantly accelerates the prediction of optoelectronic properties of conjugated polymer systems. Lu et al 208 employed four state-of-the-art DNNs to predict the optoelectronic properties of oligothiophenes (OTs), which are organic semiconductor materials being explored for use in a range of optoelectronic devices. The four DNNs differ in their molecular representations and structure: (a) a deep tensor neural network (DTNN); (b) SchNet, which uses a similar representation but a continuous-filter convolutional network (one that can operate on unequally spaced data rather than pixels); (c) a message-passing neural network (MPNN); and (d) a multilevel graph convolutional neural network (MGCN) with a representation similar to that of the MPNN.…”
Section: Structure-property Relationshipsmentioning
confidence: 99%
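A message-passing network of the kind mentioned in (c) updates per-atom feature vectors by aggregating transformed features from bonded neighbors over the molecular graph. A minimal NumPy sketch of one such step (the weights, sizes, and toy graph are illustrative, not any of the four published architectures):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy molecular graph: 4 atoms, undirected edges as index pairs.
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
h = rng.normal(size=(4, 8))            # per-atom feature vectors
W_msg = rng.normal(size=(8, 8)) * 0.1  # illustrative message weights
W_upd = rng.normal(size=(16, 8)) * 0.1 # illustrative update weights

def mp_step(h, edges):
    """One message-passing step: aggregate neighbor messages, then update."""
    m = np.zeros_like(h)
    for i, j in edges:                 # messages flow both ways
        m[i] += np.tanh(h[j] @ W_msg)
        m[j] += np.tanh(h[i] @ W_msg)
    # Update each atom from its own features and the aggregated messages.
    return np.tanh(np.concatenate([h, m], axis=1) @ W_upd)

h = mp_step(h, edges)
graph_embedding = h.sum(axis=0)        # readout: sum over atoms
print(graph_embedding.shape)           # (8,)
```

Stacking several such steps lets information propagate across the molecule; a final readout (here a plain sum) produces a fixed-size vector from which a property such as an excitation energy can be regressed.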
“…The specific GNN model we choose for our transfer learning protocol is SchNet, since it has been shown in our previous study to provide the best performance for a range of ground- and excited-state properties of OTs up to 6T. 40 Similar to other GNN models, SchNet is capable of automatically extracting optimal representations from molecular configurations without resorting to the more traditional approach of manually designing descriptors such as Coulomb matrices, 84,85 bags of bonds, 86 smooth overlap of atomic positions, 87 or generalized symmetry functions. 88 In our previous work, we have shown that SchNet can reliably predict various electronic properties (e.g., the HOMO-LUMO gap and excited-state energies) of OTs up to 6T with average errors in the range of 20-80 meV.…”
Section: B Model Trainingmentioning
confidence: 99%
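The representation learning that this excerpt attributes to SchNet rests on continuous-filter convolutions: instead of fixed pixel offsets, the filters are generated from interatomic distances, typically expanded in Gaussian radial basis functions. A NumPy sketch of one such convolution (all sizes, parameters, and the random geometry are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)

pos = rng.uniform(0, 4, (5, 3))        # toy atomic coordinates
x = rng.normal(size=(5, 16))           # per-atom feature vectors

# Expand interatomic distances in Gaussian radial basis functions: the
# continuous analogue of discrete pixel offsets in an image convolution.
centers = np.linspace(0.0, 6.0, 30)
gamma = 10.0
d = np.linalg.norm(pos[:, None] - pos[None, :], axis=-1)      # (5, 5)
rbf = np.exp(-gamma * (d[..., None] - centers) ** 2)          # (5, 5, 30)

# A filter-generating network maps each expanded distance to a filter.
W_f = rng.normal(size=(30, 16)) * 0.1
filters = np.tanh(rbf @ W_f)                                  # (5, 5, 16)

# Continuous-filter convolution: elementwise filter * neighbor features,
# summed over neighbors (self-interaction masked out).
mask = 1.0 - np.eye(5)
out = np.einsum("ijf,jf,ij->if", filters, x, mask)            # (5, 16)
print(out.shape)
```

Because the filters depend only on distances, the operation is invariant to rotations and translations of the molecule, which is one reason such models transfer well across conformations without hand-designed descriptors.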