Abstract. An algorithm for continuous data assimilation for the two-dimensional Bénard convection problem is introduced and analyzed. It is inspired by the data assimilation algorithm developed for the Navier-Stokes equations, which allows for the implementation of a variety of observables: low Fourier modes, nodal values, finite volume averages, and finite elements. The novelty here is that the observed data is obtained for the velocity field alone; i.e., no temperature measurements are needed for this algorithm. We provide conditions on the spatial resolution of the observed data, under the assumption that the observed data is free of noise, which are sufficient to show that the solution of the algorithm approaches, at an exponential rate, the unique exact unknown solution of the Bénard convection problem associated with the observed (finite-dimensional projection of the) velocity.

MSC Subject Classifications: 35Q30, 93C20, 37C50, 76B75, 34D06.
Abstract. We introduce a continuous data assimilation (downscaling) algorithm for the two-dimensional Navier-Stokes equations employing coarse mesh measurements of only one component of the velocity field. This algorithm can be implemented with a variety of finitely many observables: low Fourier modes, nodal values, finite volume averages, or finite elements. We provide conditions on the spatial resolution of the observed data, under the assumption that the observed data is free of noise, which are sufficient to show that the solution of the algorithm approaches, at an exponential rate asymptotically in time, the unique exact unknown reference solution of the 2D Navier-Stokes equations associated with the observed (finite-dimensional projection of the) velocity.

MSC Subject Classifications: 35Q30, 93C20, 37C50, 76B75, 34D06.
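The nudging mechanism behind such continuous data assimilation algorithms can be illustrated on a low-dimensional toy model. The sketch below is an assumption made for illustration only: instead of the paper's 2D Navier-Stokes setting, it drives an assimilating copy of the Lorenz-63 system using observations of only the first state component, adding a relaxation (nudging) term `-mu * (estimate - observation)` to that component's equation. The relaxation coefficient `mu`, the step size, and the initial data are illustrative choices, not values from the paper.

```python
import numpy as np

def lorenz(s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    # Classical Lorenz-63 vector field, standing in for the reference dynamics.
    x, y, z = s
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def rk4_step(f, s, dt):
    # One classical fourth-order Runge-Kutta step.
    k1 = f(s)
    k2 = f(s + 0.5 * dt * k1)
    k3 = f(s + 0.5 * dt * k2)
    k4 = f(s + dt * k3)
    return s + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0

dt = 0.001
mu = 50.0                            # nudging strength (illustrative choice)
ref = np.array([1.0, 1.0, 1.0])      # reference "truth", unknown to the assimilator
est = np.array([10.0, -5.0, 30.0])   # assimilating solution with wrong initial data

# Spin the reference trajectory up onto the attractor.
for _ in range(5000):
    ref = rk4_step(lorenz, ref, dt)

# Assimilate: observe only the first component of the reference and nudge
# the corresponding equation of the estimate toward that observation.
for _ in range(20000):
    x_obs = ref[0]
    nudged = lambda s: lorenz(s) + np.array([-mu * (s[0] - x_obs), 0.0, 0.0])
    est = rk4_step(nudged, est, dt)
    ref = rk4_step(lorenz, ref, dt)

err = np.linalg.norm(est - ref)      # small after assimilation
```

With the observed component nudged strongly enough, the unobserved components synchronize as well, mirroring (in spirit only) the exponential convergence established rigorously in the PDE setting.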
It is shown, within a mathematical framework based on a suitably defined scale of sparseness of the super-level sets of the positive and negative parts of the vorticity components, and in the context of a blow-up-type argument, that the ever-resisting 'scaling gap', i.e., the scaling distance between a regularity criterion and a corresponding a priori bound (in short, a measure of the super-criticality of the 3D Navier-Stokes regularity problem), can be reduced by an algebraic factor; since the (independent) fundamental works of Ladyzhenskaya, Prodi, and Serrin, as well as Kato and Fujita, in the 1960s, all such reductions have been logarithmic in nature, regardless of the functional setup utilized. More precisely, it is shown that it is possible to obtain an a priori bound that is algebraically better than the energy-level bound, while keeping the corresponding regularity criterion at the same level as all the classical regularity criteria. The mathematics presented was inspired by the morphology of the regions of intense vorticity/velocity gradients observed in computational simulations of turbulent flows, as well as by the physics of turbulent cascades and turbulent dissipation.
Analyzing the validity and success of a data assimilation algorithm when some state variable observations are not available is an important problem in meteorology and engineering. We present an improved data assimilation algorithm for recovering the exact full reference solution (i.e., the velocity and temperature) of the 3D Planetary Geostrophic model, at an exponential rate in time, by employing coarse spatial mesh observations of the temperature alone. This provides, in the case of this paradigm, a rigorous justification of an earlier conjecture of Charney, which states that the temperature history of the atmosphere, for certain simple atmospheric models, determines all other state variables.

MSC Subject Classifications: 35Q30, 93C20, 37C50, 76B75, 34D06.