2021
DOI: 10.1021/acs.jpcc.1c01888
Computation of the Thermal Expansion Coefficient of Graphene with Gaussian Approximation Potentials

Abstract: Direct experimental measurement of thermal expansion coefficient without substrate effects is a challenging task for two-dimensional (2D) materials, and its accurate estimation with large-scale ab initio molecular dynamics is computationally very expensive. Machine learning-based interatomic potentials trained with ab initio data have been successfully used in molecular dynamics simulations to decrease the computational cost without compromising the accuracy. In this study, we investigated using Gaussian appro…
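The abstract describes estimating the thermal expansion coefficient (TEC) from molecular dynamics. As a minimal sketch (not the authors' code), the linear TEC can be computed from lattice-parameter-versus-temperature data by finite differences; the a(T) values below are hypothetical, chosen only to illustrate graphene's well-known negative TEC near room temperature:

```python
import numpy as np

# Hypothetical lattice parameter a(T) data (in Angstrom), e.g. averaged
# from NPT molecular dynamics runs at each temperature (values invented
# for illustration; graphene's a is ~2.46 A and its TEC is negative).
T = np.array([100.0, 200.0, 300.0, 400.0, 500.0])       # temperature, K
a = np.array([2.4622, 2.4619, 2.4617, 2.4616, 2.4617])  # lattice constant

# Linear TEC: alpha(T) = (1/a) * da/dT, using central finite differences.
dadT = np.gradient(a, T)   # da/dT at each temperature point
alpha = dadT / a           # per K; negative values mean thermal contraction
```

In practice the a(T) curve would come from well-equilibrated MD trajectories with a GAP (or other ML) potential, and the finite-difference derivative could be replaced by a smooth fit to a(T) before differentiating.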

Cited by 10 publications (5 citation statements)
References 53 publications
“…C. Sevik et al. proved that the GAP model captures the lattice dynamics of graphene very well. The estimated TEC value is very close to the DFT-calculated result [14a] …”
Section: ML-based Prediction and Recognition (supporting)
confidence: 75%
“…Machine-learning interatomic potentials (MLIPs) are a powerful assistant to MD studies. [ 14 ] A. Michaelides et al. developed an ML model to obtain a faithful representation of the DFT-PES, constructing an accurate interatomic potential (GAP) for graphene. The GAP can quantitatively predict the lattice parameter, thermal expansion coefficient, and phonon properties of graphene with only a marginal compromise on accuracy.…”
Section: ML-based Prediction and Recognition (mentioning)
confidence: 99%
“…In this study, we focus on GAP models, which typically require less data to be trained than neural network potentials, and have good scalability and computational efficiency for large-scale molecular dynamics simulations. In a previous study, 8 we demonstrated that GAP models trained with DFT calculations 9 provide accurate estimates of the thermal expansion properties of graphene. Other studies have also shown the success of GAP models for thermal properties of 2DMs, such as graphene, 10 carbon allotropes, 11 monolayer h-BN, 12 h-BN allotropes, 13 silicene, 6,14 and monolayer MoS2.…”
Section: Moment (mentioning)
confidence: 91%
“…In this study, we focus on GAP models, which typically require less data to be trained than neural network potentials, and have good scalability and computational efficiency for large-scale molecular dynamics simulations. 12 In a previous study, 13 we demonstrated that GAP models trained with DFT calculations 14 provide accurate estimates of the thermal expansion properties of graphene, along with the dominant effect of the rippling/buckling on negative thermal expansion. 15 Other studies have also shown the success of GAP models for thermal properties of 2DMs, such as graphene, 16 carbon allotropes, 17 monolayer h-BN, 18 h-BN allotropes, 19 silicene, 11,20 and monolayer MoS2.…”
Section: Introduction (mentioning)
confidence: 87%