2023
DOI: 10.22541/essoar.168614667.71811888/v1
Preprint

Emulation of cloud microphysics in a climate model

Abstract: We present a machine learning-based emulator of a microphysics scheme for condensation and precipitation processes (Zhao-Carr) used operationally in a global atmospheric forecast model (FV3GFS). Our tailored emulator architecture achieves high skill (≥94%) in predicting condensate and precipitation amounts and maintains low global-average bias (≤4%) for 1 year of continuous simulation when replacing the Fortran scheme. The stability and success of this emulator stem from key design decisions. By separating th…

Cited by 4 publications (3 citation statements)
References 1 publication

Citation statements
“…The code and configurations used to produce training data, train ML models, and run FV3GFS simulations are available on Github (Perkins et al., 2023a, https://github.com/ai2cm/zc-emulation-manuscript) and archived on Zenodo (https://doi.org/10.5281/zenodo.7976184). The data and docker images to reproduce results with the code are available on Zenodo (Perkins et al., 2023b, https://doi.org/10.5281/zenodo.10637644).…”
Section: Data Availability Statement
Citation type: mentioning (confidence: 99%)
“…The code and configurations used to produce training data, train ML models, and run FV3GFS simulations are available on Github (Perkins et al., 2023a,…”
Section: Appendix A: Zhao-Carr Microphysics
Citation type: mentioning (confidence: 99%)
“…The rise of machine learning (ML, i.e., data-driven models) capabilities has fostered new approaches to improving parameterizations (Gentine et al., 2018). Examples include replacing computationally intensive physical parameterizations with ML emulation (Keller & Evans, 2019; Krasnopolsky et al., 2005, 2010; Lagerquist et al., 2021; O'Gorman & Dwyer, 2018; Perkins et al., 2023) and training ML against observations (Chen et al., 2023; McGibbon & Bretherton, 2019; Watt-Meyer et al., 2021) or more accurate and computationally intensive parameterizations (Chantry et al., 2021). ML parameterizations for coarse-grid models have been trained on coarsened (coarse-grained) outputs of fine-grid or super-parameterized reference simulations, for example, to predict the effect of the full physics parameterization (Brenowitz & Bretherton, 2019; Han et al., 2020; Rasp et al., 2018; Watt-Meyer et al., 2024; Yuval et al., 2021), or a column-wise correction to the coarse-grid model physics (Bretherton et al., 2022; Clark et al., 2022; Kwa et al., 2023).…”
Section: Introduction
Citation type: mentioning (confidence: 99%)