2023
DOI: 10.1029/2023ms003890
Parameterizing Vertical Mixing Coefficients in the Ocean Surface Boundary Layer Using Neural Networks

Aakash Sane,
Brandon G. Reichl,
Alistair Adcroft
et al.

Abstract: Vertical mixing parameterizations in ocean models are formulated on the basis of the physical principles that govern turbulent mixing. However, many parameterizations include ad hoc components that are not well constrained by theory or data. One such component is the eddy diffusivity model, where vertical turbulent fluxes of a quantity are parameterized from a variable eddy diffusion coefficient and the mean vertical gradient of the quantity. In this work, we improve a parameterization of vertical mixing in th…
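The eddy diffusivity model the abstract describes can be sketched numerically. In this illustrative example (the grid, profiles, and values are hypothetical, not from the paper), the vertical turbulent flux of a quantity q is parameterized as the product of an eddy diffusion coefficient K and the mean vertical gradient, F = -K dq/dz:

```python
import numpy as np

# Illustrative vertical grid: surface (0 m) down to 50 m
z = np.linspace(0.0, 50.0, 51)       # depth (m), positive downward
q = 20.0 - 0.05 * z                  # mean tracer profile, e.g. temperature (degC)
K = 1e-2 * np.exp(-z / 20.0)         # eddy diffusivity (m^2/s), decaying with depth

# Down-gradient eddy diffusivity closure: F = -K * dq/dz
dqdz = np.gradient(q, z)             # mean vertical gradient of q
flux = -K * dqdz                     # parameterized vertical turbulent flux

print(flux[0])                       # surface flux; here -1e-2 * (-0.05) = 5e-4
```

The closure itself is simple; the difficulty the paper addresses is how K should vary with the flow state, which is where a neural network can replace the ad hoc component.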

Cited by 10 publications (7 citation statements)
References 68 publications
“…Second, the divergence operator can be implemented as a final convolutional layer in GAN and VAE models, similarly to Zanna and Bolton (2020) and Srinivasan et al. (2023). Third, a promising approach would be to propose an ML model predicting the free parameters in physical parameterizations (Sane et al., 2023; Zhu et al., 2022) and thus automatically satisfy conservation properties. However, we note that a suitable form of parameterization of mesoscale eddy fluxes remains to be established.…”
Section: Conclusion and Discussion
confidence: 99%
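The conservation argument in the statement above can be made concrete with a toy sketch (all names and numbers here are hypothetical, and the "ML model" is a stand-in, not the cited authors' networks): when a learned model predicts only a free parameter such as the diffusivity K, and the physical parameterization applies it in flux form, the column total of the tracer is conserved by construction regardless of what the model predicts.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a trained network: a random linear map from
# mean-state features to a positive eddy diffusivity K.
W = rng.standard_normal((1, 3))
def predict_K(features):
    return np.exp(W @ features).item()   # exp keeps K > 0

dz, dt = 1.0, 10.0
q = 20.0 - 0.05 * np.arange(0.0, 10.0, dz)        # tracer on 10 cells
features = np.array([q.mean(), q[0] - q[-1], 1.0])
K = predict_K(features)

# Flux form guarantees conservation: interior fluxes cancel in the
# column sum, and zero-flux boundaries leave the total unchanged.
F = np.zeros(q.size + 1)                          # fluxes at cell interfaces
F[1:-1] = -K * (q[1:] - q[:-1]) / dz              # F = -K dq/dz
q_new = q - dt * (F[1:] - F[:-1]) / dz

print(np.isclose(q_new.sum(), q.sum()))           # True: total tracer conserved
```

By contrast, a network that predicted the fluxes (or tendencies) directly would carry no such guarantee, which is the appeal of learning the free parameters inside the physical form.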
“…In the context of dynamical climate models, DL algorithms have proven effective tools for deriving model parameterizations directly from numerical simulations. For example, many past studies have focused on learning subgrid parameterizations from high resolution experiments and/or observations of the ocean (Bolton & Zanna, 2019; Sane et al., 2023; Zanna & Bolton, 2020; Zhu et al., 2022), atmosphere (Brenowitz & Bretherton, 2018; Gentine et al., 2018; O’Gorman & Dwyer, 2018; Rasp et al., 2018; P. Wang et al., 2022; Yuval & O’Gorman, 2020), and sea ice (Finn et al., 2023). In the context of DA‐based approaches, some recent studies have relied on iterative sequences of DA and ML to infer unresolved scale parameterizations from sparse and noisy observations (Brajard et al., 2021), or to learn state‐dependent model error from analysis increments (Farchi et al., 2021) and nudging tendencies (Bretherton et al., 2022; Watt‐Meyer et al., 2021), while others have combined DA with equation discovery to extract interpretable structural model errors (Mojgani et al., 2022).…”
Section: Introduction
confidence: 99%
“…MOM6, however, like most large-scale scientific computation models, is written in a low-level programming language, Fortran. One approach to this language barrier is to “port” the code, translating from one language to another (e.g., the work in Sane et al. (2023)). In this case, this would entail rewriting some machine learning libraries in Fortran.…”
Section: Introduction
confidence: 99%