2019
DOI: 10.1029/2018ms001472
Applications of Deep Learning to Ocean Data Inference and Subgrid Parameterization

Abstract: Oceanographic observations are limited by sampling rates, while ocean models are limited by finite resolution and high viscosity and diffusion coefficients. Therefore, both data from observations and ocean models lack information at small and fast scales. Methods are needed to either extract information, extrapolate, or upscale existing oceanographic data sets, to account for or represent unresolved physical processes. Here we use machine learning to leverage observations and model data by predicting unresolved…

Cited by 307 publications (270 citation statements) · References 54 publications
“…Regularization is critical for more complex machine learning models, so that they can converge to optimal and robust configurations in large parameter spaces. Machine learning for parameterizations has been considered since Krasnopolsky et al. (2005), and recently, multiple groups have begun developing new parameterizations for a variety of processes (Bolton & Zanna, 2019; Gentine et al., 2018; Rasp et al., 2018; Schneider et al., 2017). The most common regularization in existing machine learning parameterization approaches has been multitask learning (Caruana, 1997), in which the machine learning model predicts multiple correlated values simultaneously and learns to preserve the correlations.…”
Section: Introduction (mentioning)
confidence: 99%
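To make the multitask-learning idea in the excerpt above concrete, here is a minimal sketch (written in PyTorch, not code from any of the cited papers) of a network with one shared trunk and several output heads trained jointly; fitting correlated targets through a shared representation is what provides the regularization. The layer sizes, variable names, and the plain joint MSE loss are illustrative assumptions.

```python
# Minimal multitask-learning sketch (illustrative only; not from the cited papers).
# A shared trunk feeds several output heads; training them jointly regularizes the
# shared features because they must explain all correlated targets at once.
import torch
import torch.nn as nn

class MultitaskNet(nn.Module):
    def __init__(self, n_inputs=16, n_hidden=64, n_tasks=3):
        super().__init__()
        # Shared representation used by every task.
        self.trunk = nn.Sequential(
            nn.Linear(n_inputs, n_hidden), nn.ReLU(),
            nn.Linear(n_hidden, n_hidden), nn.ReLU(),
        )
        # One small head per predicted quantity (e.g. correlated subgrid terms).
        self.heads = nn.ModuleList([nn.Linear(n_hidden, 1) for _ in range(n_tasks)])

    def forward(self, x):
        z = self.trunk(x)
        return torch.cat([head(z) for head in self.heads], dim=-1)

# Toy training loop on random data (placeholders for real inputs and targets).
model = MultitaskNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.randn(256, 16)   # hypothetical coarse-grid input features
y = torch.randn(256, 3)    # hypothetical correlated subgrid targets
for _ in range(100):
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(x), y)  # joint MSE across all tasks
    loss.backward()
    opt.step()
```

In practice each head would predict one physical quantity (for example, a different component of a subgrid forcing), and the per-task losses could be weighted rather than averaged uniformly.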
“…Recent advances in artificial intelligence have revolutionized how problems in various domains of business and science are approached (Goodfellow et al., 2016; LeCun et al., 2015). For example, in climate science, using machine learning techniques to accurately and efficiently represent unresolved physical processes in the atmosphere and ocean has produced promising results (Brenowitz & Bretherton, 2018; Bolton & Zanna, 2019; O'Gorman & Dwyer, 2018; Rasp et al., 2018; Salehipour & Peltier, 2019) and has the potential to significantly improve climate modeling and long-term climate projections in the coming years (Chattopadhyay et al., 2019; Gentine et al., 2018; Reichstein et al., 2019; Schneider et al., 2017). Moreover, deep learning techniques have been very successful in predicting some types of sequential data (Goodfellow et al., 2016).…”
Section: Introduction (mentioning)
confidence: 99%
“…W_{n,f} indicates the fraction of fine grid box f within coarse grid box n, and the vertical level of the field is indicated by index k. Note that this is one choice of coarse-graining procedure, with alternatives including Gaussian (Bolton and Zanna, 2019) or spectral filters (Shutts and Pallares, 2018).…”
Section: The Coarse-graining Framework (mentioning)
confidence: 97%
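Read literally, the weights in this excerpt define an area-weighted average; one plausible way to write that operator (a reconstruction assuming fine grid boxes of equal area, not a formula quoted from the citing paper) is:

```latex
% Coarse-grained field \bar{\phi} in coarse box n at level k, built from fine
% boxes f with overlap fractions W_{n,f} (reconstruction; equal fine-box areas assumed).
\bar{\phi}_{n,k} = \frac{\sum_{f} W_{n,f}\, \phi_{f,k}}{\sum_{f} W_{n,f}}
```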
“…The coarse (fine) grid box is identified by the index n (f). W_{n,f} indicates the fraction of fine grid box f within coarse grid box n, and the vertical level of the field is indicated by index k. Note that this is one choice of coarse-graining procedure, with alternatives including Gaussian (Bolton and Zanna, 2019) or spectral filters (Shutts and Pallares, 2018). The fine- and coarse-resolution datasets are defined on model levels, and interpolation must also be performed in the vertical.…”
Section: Coarse-graining the Cascade Dataset (mentioning)
confidence: 99%
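The two filtering choices mentioned in these excerpts can be sketched as follows; this is an illustration under my own assumptions (a 2-D field on a regular fine grid, coarse boxes formed from non-overlapping blocks of fine boxes), not the coarse-graining code of the citing paper or of Bolton and Zanna (2019).

```python
# Sketch of two coarse-graining choices (illustrative assumptions: a 2-D field on a
# regular fine grid; each coarse box fully contains factor*factor fine boxes, so the
# W_{n,f}-weighted average reduces to a plain block mean).
import numpy as np
from scipy.ndimage import gaussian_filter

def block_coarse_grain(field, factor):
    """Area-weighted average of the fine boxes inside each coarse box."""
    ny, nx = field.shape
    assert ny % factor == 0 and nx % factor == 0, "grid must divide evenly"
    blocks = field.reshape(ny // factor, factor, nx // factor, factor)
    return blocks.mean(axis=(1, 3))   # equal weights for fully contained fine boxes

def gaussian_coarse_grain(field, sigma, factor):
    """Gaussian filter (the kind of filter used by Bolton & Zanna, 2019) plus subsampling."""
    smoothed = gaussian_filter(field, sigma=sigma)
    return smoothed[::factor, ::factor]

# Example: coarse-grain a random 64x64 "fine" field by a factor of 4.
fine = np.random.rand(64, 64)
coarse_block = block_coarse_grain(fine, factor=4)            # shape (16, 16)
coarse_gauss = gaussian_coarse_grain(fine, sigma=2.0, factor=4)
```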