2019
DOI: 10.1021/acs.jpcc.9b03370

Lattice Convolutional Neural Network Modeling of Adsorbate Coverage Effects

Abstract: Coverage effects, known also as lateral interactions, are often important in surface processes, but their study via exhaustive density functional theory (DFT) is impractical because of the large configurational degrees of freedom. The cluster expansion (CE) is the most popular surrogate model accounting for coverage effects but suffers from slow convergence, its linear form, and its tendency to be biased toward the selection of smaller clusters. We develop a novel lattice convolutional neural network (LCNN) th…
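The abstract describes a convolutional network that operates directly on lattice occupancies. A minimal sketch of that idea, assuming illustrative sizes and a toy 1-D periodic lattice (this is not the paper's architecture; all names and dimensions here are made up for illustration):

```python
import numpy as np

# Toy sketch of a lattice convolution: sites on a periodic 1-D lattice carry
# one-hot occupancy features (empty vs. adsorbate), and a shared weight matrix
# mixes each site's feature with those of its two neighbours.
rng = np.random.default_rng(0)

n_sites, n_feat = 8, 2
occupancy = rng.integers(0, 2, size=n_sites)         # 0 = empty, 1 = adsorbate
x = np.eye(n_feat)[occupancy]                        # one-hot site features, (8, 2)

W = rng.normal(size=(3 * n_feat, n_feat))            # shared convolution weights

left = np.roll(x, 1, axis=0)                         # periodic boundary conditions
right = np.roll(x, -1, axis=0)
neighborhood = np.concatenate([left, x, right], axis=1)  # (8, 6)

h = np.tanh(neighborhood @ W)                        # convolved site features
site_energy = h.sum(axis=1)                          # toy per-site contribution
total_energy = site_energy.sum()                     # toy formation-energy surrogate
```

Because the weights are shared across sites, the number of parameters is independent of the lattice size, which is what makes such a model a convolution rather than a fully connected network.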

Cited by 33 publications (36 citation statements)
References 43 publications
“…Previously successful deep learning models are carefully designed to account for the underlying physics between the representation and the properties of interest. A number of deep learning models are available, some of which are: the crystal graph convolutional neural network (CGCNN), [141] the lattice convolutional neural network (LCNN), [142] atom-centered symmetry functions (ACSF), [143] and SchNet. [144] CGCNN is a graph convolutional neural network relying on the molecular graph theory representation of physical systems (Figure 8a,b).…”
Section: Machine Learning Methods (mentioning)
confidence: 99%
“…CGCNN has been highly successful for describing a wide variety of crystal systems. LCNN [142] is another graph convolutional neural network, developed for lattices (Figure 8c,d). LCNN takes advantage of the symmetry within the lattice.…”
Section: Machine Learning Methods (mentioning)
confidence: 99%
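The statement above notes that LCNN exploits lattice symmetry. One common way to build such invariance, sketched here under illustrative assumptions (a 1-D periodic lattice whose only nontrivial symmetry is mirror reflection; sizes and names are made up, not the paper's implementation):

```python
import numpy as np

# Hedged sketch of a symmetry-averaged lattice convolution: before applying
# shared weights, average the output over symmetry-equivalent orderings of a
# site's neighbour list (here, the two reflections of a 1-D neighbourhood).
rng = np.random.default_rng(1)
n_sites, n_feat = 6, 2
x = rng.normal(size=(n_sites, n_feat))

left, right = np.roll(x, 1, axis=0), np.roll(x, -1, axis=0)

# two symmetry-equivalent orderings: (left, self, right) and its mirror image
order_a = np.concatenate([left, x, right], axis=1)
order_b = np.concatenate([right, x, left], axis=1)

W = rng.normal(size=(3 * n_feat, n_feat))
h = 0.5 * (np.tanh(order_a @ W) + np.tanh(order_b @ W))

# the averaged output is invariant to mirroring the lattice
x_m = x[::-1]
left_m, right_m = np.roll(x_m, 1, axis=0), np.roll(x_m, -1, axis=0)
h_mirror = 0.5 * (np.tanh(np.concatenate([left_m, x_m, right_m], axis=1) @ W)
                  + np.tanh(np.concatenate([right_m, x_m, left_m], axis=1) @ W))
```

Averaging over the symmetry group means the network never has to learn that mirrored configurations have the same energy; the invariance is built in by construction.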
“…To address these problems, workflows and algorithms have been developed to provide initial guesses of adsorbate geometries, which are then refined through DFT calculations, and to efficiently incorporate simulated data into databases for further analysis [13][14][15][16][17][18][19][20][21] . Various methods, including cluster expansions and genetic algorithms, have been employed to model high coverages of adsorbates on pure metal surfaces [22][23][24][25] . Although successful advances have been made, analysis of high coverages of adsorbates on low symmetry surfaces, such as steps and multi-elemental alloys, remains challenging, and there exists no generalized procedure to systematically simulate these cases.…”
Section: Introduction (mentioning)
confidence: 99%
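The statement above mentions cluster expansions for modeling high adsorbate coverages. A minimal sketch of a pair cluster expansion, with interaction values that are purely illustrative (a clean-surface reference, a per-site adsorption energy, and a nearest-neighbour repulsion on a toy 1-D periodic lattice):

```python
import numpy as np

# Illustrative pair cluster expansion for coverage on a periodic 1-D lattice:
#   E(sigma) = E0 + V1 * sum_i sigma_i + V2 * sum_i sigma_i * sigma_{i+1}
# The effective interactions below are made up for the sketch.
E0, V1, V2 = 0.0, -1.0, 0.25   # reference, adsorption energy, NN repulsion

def ce_energy(sigma):
    sigma = np.asarray(sigma)
    onsite = V1 * sigma.sum()
    pairs = V2 * (sigma * np.roll(sigma, -1)).sum()  # periodic NN pairs
    return E0 + onsite + pairs

# repulsive pair term weakens adsorption per site at high coverage
low = ce_energy([1, 0, 0, 0])    # -> -1.0  (no occupied neighbour pairs)
high = ce_energy([1, 1, 1, 1])   # -> -3.0  (-4.0 adsorption + 1.0 repulsion)
```

The truncation to pair terms is what the cited abstract criticizes: real lateral interactions can require many-body clusters, which is exactly where nonlinear surrogate models become attractive.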
“…[15][16][17][18] These concepts are increasingly being adapted for solid state materials, often involving materials representations that combine properties of the constituent elements with structural features. [19][20][21][22][23][24][25][26][27][28] While these approaches are being deployed to good effect, they are not applicable in the common scenario of experimental materials science wherein a given material is composed of a mixture of phases, or even more so when no knowledge of the phases is available. In exploratory research for materials with specific properties, measurement and interpretation of composition and property data are often far less expensive than measurement and interpretation of structural data.…”
Section: Introduction (mentioning)
confidence: 99%