An example abstract: In this paper, we propose a generic algorithm to train machine learning-based subgrid parametrizations online, i.e., with a posteriori loss functions, but for non-differentiable numerical solvers. The proposed approach leverages a neural emulator to approximate the reduced state-space solver, which is then used to allow gradient propagation through temporal integration steps. We apply this methodology to a single-layer quasi-geostrophic system with topography, for which parametrizations trained with offline strategies are known to become highly unstable within around 500 temporal iterations. Using our algorithm, we are able to train a parametrization that recovers most of the benefits of online strategies without having to compute the gradient of the original solver. We demonstrate that training the neural emulator and the parametrization separately, with different loss quantities, is necessary to minimize the propagation of approximation biases. Experiments on emulator architectures of different complexities also indicate that emulator performance is key to learning an accurate parametrization. This work is a step towards learning parametrizations with online strategies for weather models.
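As an illustration of the training loop described in this abstract, here is a minimal PyTorch-style sketch. The names `emulator` (a differentiable neural surrogate of the non-differentiable reduced solver), `parametrization` (the SGS model being optimized), and `reference` (a posteriori target states from the direct simulation) are hypothetical, and the emulator is assumed to have been trained beforehand with its own loss.

```python
import torch

def a_posteriori_loss(parametrization, emulator, state, reference, n_steps):
    """Roll out the emulated solver plus SGS correction and compare the
    resulting trajectory with reference states (a posteriori / online loss)."""
    loss = 0.0
    for k in range(n_steps):
        # The differentiable emulator stands in for the numerical solver,
        # so gradients can propagate through the temporal integration step.
        state = emulator(state) + parametrization(state)
        loss = loss + torch.mean((state - reference[k]) ** 2)
    return loss / n_steps

# Hypothetical usage: only the parametrization's weights are updated here.
# optimizer = torch.optim.Adam(parametrization.parameters(), lr=1e-4)
# optimizer.zero_grad()
# a_posteriori_loss(parametrization, emulator, state0, reference, 50).backward()
# optimizer.step()
```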
Document and math typesetting
This is an extract from my last submitted paper.
In turbulent flows, the energy cascade drives energy from the large scales to the small scales until molecular viscous dissipation (forward-scatter), but the inverse transfer called backscatter, in which energy is transferred from the small scales back to the large scales (Lesieur & Metais, 1996), is also at play, particularly in geophysical flows. This is explained by the relative dominance of the Coriolis force, which creates vortical structures that appear two-dimensional. Historically, developing SGS models that account for backscatter has been a challenging task (Piomelli et al., 1991; Schumann, 1995; Liu et al., 2011). Indeed, an overprediction of backscatter that cannot be compensated by eddy viscosity leads to an accumulation of small-scale energy, causing simulations to become numerically unstable. In two-dimensional flows, we observe, in a statistical sense, a dual cascade composed of a “forward” enstrophy cascade and an “inverse” energy cascade. As a consequence, a large number of SGS models have been proposed, in particular for geophysical flows (see Danilov et al., 2019 for a review), with well-documented configurations and performance metrics (Fox-Kemper & Menemenlis, 2008). SGS modeling is also a key issue for the simulation of ocean and atmosphere dynamics because of the large range of motions involved (Jansen et al., 2015; Juricke et al., 2020; Frederiksen et al., 2012). As a case study framework, we consider barotropic QG flows. While providing an approximate yet representative system for the rotating stratified flows found in atmosphere and ocean dynamics, they involve relatively complex SGS features that make the learning problem non-trivial. As such, QG flows are regarded as an ideal playground to explore and assess the relevance of machine learning strategies for SGS models in geophysical turbulence. The governing equation for the direct vorticity in the QG model with bottom topography is given below.
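A standard form of the barotropic QG vorticity equation with bottom topography, sketched here under assumed notation (ψ the streamfunction, ω = ∇²ψ the direct relative vorticity, h_b the bottom topography, H the mean layer depth, f₀ and β the Coriolis parameter and its meridional gradient, ν and μ the viscous and linear drag coefficients, F an external forcing; the paper's exact dissipation and forcing terms may differ), is

$$
\partial_t \omega + J\!\left(\psi,\ \omega + \beta y + \frac{f_0}{H} h_b\right) = \nu \nabla^2 \omega - \mu \omega + F,
\qquad \omega = \nabla^2 \psi,
$$

where $J(a, b) = \partial_x a\,\partial_y b - \partial_y a\,\partial_x b$ denotes the Jacobian operator.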
Other relevant elements
This is an image also extracted from my last submitted paper.

The figure elements can also be described in a table
| Left | Middle | Right |
|---|---|---|
| Direct vorticity | Reduced vorticity | SGS term |
or as a list
- Direct vorticity.
- Reduced vorticity.
- SGS term.
This is a sample "hidden" section with Python code (taken from NumPy's documentation)
For example, you can create an array from a regular Python list or tuple using the array function. The type of the resulting array is deduced from the type of the elements in the sequences.
```python
import numpy as np

# The array's dtype is deduced from the element types of the Python sequence.
a = np.array([1, 2, 3])
```
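Continuing the same example, a sequence of Python floats yields a floating-point array, which can be checked through the dtype attribute:

```python
import numpy as np

# Python floats map to NumPy's float64 dtype.
b = np.array([1.2, 3.5, 5.1])
print(b.dtype)  # float64
```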