2021
DOI: 10.5194/gmd-2021-71
Preprint

Towards physically consistent data-driven weather forecasting: Integrating data assimilation with equivariance-preserving spatial transformers in a case study with ERA5

Abstract: There is growing interest in data-driven weather prediction (DDWP), for example using convolutional neural networks such as U-NETs that are trained on data from models or reanalysis. Here, we propose 3 components to integrate with commonly used DDWP models in order to improve their physical consistency and forecast accuracy. These components are 1) a deep spatial transformer added to the latent space of the U-NETs to preserve a property called equivariance, which is related to correctly capturing rot…


Cited by 11 publications (6 citation statements)
References 67 publications
“…This approach, however, comes at the cost of lower precision and may not be compatible with all implementations of the update algorithm. Looking forward, researchers in other fields have begun experimenting with machine learning, outsourcing at least some computations to neural networks trained on a large set of model realizations (e.g., Chattopadhyay et al., 2021 ). However, this technique is relatively new and has not yet been adapted for this application.…”
Section: Discussion
confidence: 99%
“…Physics‐informed ML (or knowledge/theory‐guided ML) couples physical knowledge to the ML architecture and offers one approach to enhance ML generalizability and trust (e.g., Gentine et al., 2021; Irrgang et al., 2021; Karpatne et al., 2017; Kashinath et al., 2021; Raissi et al., 2019a). Here, physical conservation laws are incorporated into ML algorithms, either by constraining the loss function (soft constraints, also called regularization; e.g., Brenowitz et al., 2020; Harder et al., 2022) or by more strictly enforcing conserved properties (hard constraints; e.g., Beucler et al., 2021a; Chattopadhyay et al., 2021). Variations of physics‐informed ML include designing “climate‐invariant” algorithms by rescaling inputs and outputs to avoid extrapolation (Beucler et al., 2021b) or incorporating equations governing the dynamics to build hybrid ML algorithms (e.g., Pathak et al., 2018b; Raissi et al., 2019a).…”
Section: Data‐driven Methods: The Emergence of Machine Learning
confidence: 99%
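The soft- versus hard-constraint distinction quoted above can be sketched in a few lines. In this illustrative example the conserved quantity is a domain-integrated sum (standing in for mass or energy); the function names, the choice of conserved functional, and the penalty weight are all hypothetical and not taken from the cited papers:

```python
import numpy as np

def soft_constraint_loss(pred, target, lam=1.0):
    """Data misfit plus a soft physics penalty (regularization).

    The penalty term nudges the network toward conserving the
    domain-integrated quantity, but does not guarantee it exactly.
    """
    mse = np.mean((pred - target) ** 2)
    # Soft constraint: squared mismatch of the conserved integral.
    conservation = (pred.sum() - target.sum()) ** 2
    return mse + lam * conservation

def hard_constraint(pred, conserved_total):
    """Hard constraint: rescale the output so the integral matches
    exactly, regardless of what the network produced."""
    return pred * (conserved_total / pred.sum())
```

In practice the soft penalty would be added to the training loss of the neural network, while the hard constraint would be implemented as a fixed, non-trainable layer at the output of the architecture.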
“…These methods both involve careful design based on expert knowledge of the possible regimes within the climate system, much like a traditional physics‐based model, and once built, can also harness the computational speed‐ups offered by ML. Similarly, physics‐informed or hybrid approaches as discussed above offer a way to incorporate interpretable steps into a model (e.g., by enforcing conservation laws, Beucler et al., 2021a; Chattopadhyay et al., 2021). Alternatively, techniques such as data‐driven equation discovery can learn closed‐form equations for SGS parameterizations, which are more easily interpreted (e.g., Mojgani et al., 2022; Raissi et al., 2019b; Zanna & Bolton, 2020).…”
Section: Data‐driven Methods: The Emergence of Machine Learning
confidence: 99%
“…Furthermore, a sigma‐point ensemble Kalman algorithm and the U‐STN model were also integrated in Chattopadhyay et al. (2021) to provide stable, accurate DA cycles for geopotential height prediction. It showed that the gain from applying DA to an ML‐based surrogate model would be most significant when the observations are noisy and sparse.…”
Section: Related Work
confidence: 99%
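The data-assimilation cycle described above alternates two steps: the ML surrogate advances an ensemble of states forward in time, and a Kalman-type analysis corrects the ensemble with noisy, sparse observations. The sketch below implements one generic stochastic EnKF analysis step (not the sigma-point variant used in the cited study); all names and shapes are illustrative:

```python
import numpy as np

def enkf_analysis(ensemble, obs, H, obs_var, rng):
    """One stochastic EnKF analysis step (generic sketch).

    ensemble : (n_members, n_state) forecast ensemble from the surrogate
    obs      : (n_obs,) noisy observation vector
    H        : (n_obs, n_state) observation operator; sparse observations
               correspond to few rows
    obs_var  : observation-error variance (scalar, assumed uncorrelated)
    """
    n = ensemble.shape[0]
    Xf = ensemble - ensemble.mean(axis=0)        # forecast anomalies
    Yf = Xf @ H.T                                # anomalies in obs space
    Pyy = Yf.T @ Yf / (n - 1) + obs_var * np.eye(H.shape[0])
    Pxy = Xf.T @ Yf / (n - 1)
    K = Pxy @ np.linalg.inv(Pyy)                 # Kalman gain
    # Perturbed observations keep the analysis-ensemble spread consistent.
    perturbed = obs + rng.normal(0.0, np.sqrt(obs_var), size=(n, H.shape[0]))
    innovations = perturbed - ensemble @ H.T
    return ensemble + innovations @ K.T
```

In a full cycle, each analysis member would then be propagated to the next observation time by the trained surrogate model, which is where the gain from DA is reported to matter most when observations are noisy and sparse.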