We analyse the performance of different limited-memory quasi-Newton methods for unconstrained DNS-based optimization. Optimization based on Direct Numerical Simulation (DNS) of turbulent flows is extremely expensive, as each functional and gradient evaluation requires simulating the Navier-Stokes and adjoint Navier-Stokes equations at high spatial and temporal resolution. Nowadays, simple and robust nonlinear conjugate gradient methods are generally used for DNS-based optimal control, as they require little memory overhead in a large control space. In the current study, we investigate the use of quasi-Newton methods instead. They use a cheap approximation of the Hessian to improve both the step direction and the step length, leading to faster convergence of the optimization. Since control spaces are often large in DNS-based optimization, we restrict our attention to limited-memory quasi-Newton methods. Three methods are studied: the discrete truncated Newton method, the limited-memory BFGS (L-BFGS) method, and the damped L-BFGS method. The latter is designed for constrained optimization, but can also address unconstrained problems. Furthermore, the damped L-BFGS method only requires the Armijo condition in the line search, rather than the full Wolfe conditions, reducing the number of expensive functional and gradient evaluations. We investigate the combination of the three quasi-Newton methods with three line-search methods based on bisection, quadratic interpolation, or cubic interpolation. Initially, all possible combinations are evaluated in a test problem that is based on the extended
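To make the ingredients above concrete, the following Python sketch combines the standard L-BFGS two-loop recursion with a backtracking line search that enforces only the Armijo (sufficient-decrease) condition. This is our own minimal illustration, not the study's implementation: all function names are ours, and in the DNS setting each call to `f` or to the gradient would stand for a full (adjoint) Navier-Stokes simulation.

```python
import numpy as np

def lbfgs_direction(grad, s_list, y_list):
    """L-BFGS two-loop recursion: apply the inverse-Hessian
    approximation built from the stored pairs (s_k, y_k) to grad,
    without ever forming the Hessian. Returns the search direction."""
    q = grad.copy()
    rhos = [1.0 / (y @ s) for s, y in zip(s_list, y_list)]
    alphas = []
    # First loop: most recent curvature pair to oldest.
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        a = rho * (s @ q)
        alphas.append(a)
        q -= a * y
    # Scale by gamma = s'y / y'y, the usual initial Hessian guess.
    if s_list:
        s, y = s_list[-1], y_list[-1]
        q *= (s @ y) / (y @ y)
    # Second loop: oldest pair to most recent.
    for s, y, rho, a in zip(s_list, y_list, rhos, reversed(alphas)):
        b = rho * (y @ q)
        q += (a - b) * s
    return -q

def armijo_backtracking(f, x, d, g, c1=1e-4, tau=0.5, max_iter=30):
    """Backtracking line search enforcing only the Armijo condition
    f(x + t d) <= f(x) + c1 t g'd, as opposed to the full Wolfe
    conditions; each trial costs one functional evaluation only."""
    fx, slope, t = f(x), g @ d, 1.0
    for _ in range(max_iter):
        if f(x + t * d) <= fx + c1 * t * slope:
            break
        t *= tau
    return t
```

With a memory of m stored pairs, one direction computation costs O(mn) extra work and storage in an n-dimensional control space, which is why limited-memory variants are attractive when the control space is large.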
The use of PDE-constrained optimization techniques in combination with transient three-dimensional turbulent flow simulations, such as Direct Numerical Simulation (DNS) or Large-Eddy Simulation (LES), involves large computational costs and memory requirements. To date, the minimization of a DNS-based cost functional has typically been achieved with classical single-grid gradient-based iterative methods of quasi-Newton or nonlinear conjugate gradient type. In the current study, a multigrid optimization (MG/OPT) strategy is investigated in order to speed up gradient-based algorithms designed for large-scale optimization problems. The method employs a hierarchy of optimization problems defined on different representation levels, and aims to reduce the computational cost of improving the cost functional on the finest level. We apply the MG/OPT method in the context of direct numerical simulations of a fully developed channel flow problem. The performance of the multigrid optimization technique is compared against that of the single-grid optimization method in terms of equivalent function and gradient evaluations. The influence of the optimization problem's properties and of algorithmic parameters is also investigated. It is found that, in some cases, the MG/OPT method accelerates the single-grid damped L-BFGS method by a factor of four.
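The MG/OPT idea of a hierarchy of optimization problems can be sketched in a few lines. The version below is our own simplified illustration, not the study's implementation: it applies Nash-style MG/OPT V-cycles to a 1D quadratic model problem min 0.5 x'Ax - b'x, with a few exact-line-search gradient-descent steps playing the role of the optimization iterations on each level, and a first-order coherence shift so the coarse objective's gradient matches the restricted fine gradient.

```python
import numpy as np

def laplacian(n):
    """1D model problem: A is the finite-difference Laplacian on n
    interior points, so f(x) = 0.5 x'Ax - b'x is strictly convex."""
    h = 1.0 / (n + 1)
    return (2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h**2

def prolongation(n_coarse):
    """Linear interpolation from n_coarse to 2*n_coarse + 1 points."""
    P = np.zeros((2 * n_coarse + 1, n_coarse))
    for j in range(n_coarse):
        P[2 * j, j] += 0.5
        P[2 * j + 1, j] = 1.0
        P[2 * j + 2, j] += 0.5
    return P

def smooth(A, b, x, iters=3):
    """A few exact-line-search gradient-descent steps stand in for the
    single-level optimization iterations ('smoothing')."""
    for _ in range(iters):
        r = b - A @ x                     # negative gradient
        rAr = r @ A @ r
        if rAr > 0:
            x = x + (r @ r) / rAr * r
    return x

def mgopt(levels, k, x, b):
    """One Nash-style MG/OPT V-cycle on level k for min 0.5 x'Ax - b'x.
    levels[k] = (A_k, P_k), with P_k interpolating level k-1 -> k."""
    A, P = levels[k]
    if k == 0:
        return np.linalg.solve(A, b)      # coarsest problem solved exactly
    x = smooth(A, b, x)                   # pre-smoothing
    A_c, _ = levels[k - 1]
    R = 0.5 * P.T                         # full-weighting restriction
    x_c = R @ x
    # Shift the coarse right-hand side so the coarse gradient at x_c
    # equals the restricted fine gradient (first-order coherence).
    b_c = A_c @ x_c - R @ (A @ x - b)
    y_c = mgopt(levels, k - 1, x_c, b_c)  # recurse on the coarse problem
    e = P @ (y_c - x_c)                   # prolongated coarse correction
    eAe = e @ A @ e
    if eAe > 0:
        x = x + (e @ (b - A @ x)) / eAe * e   # line search along e
    return smooth(A, b, x)                # post-smoothing
```

Each V-cycle shifts part of the optimization work to the cheap coarse levels; in the study's DNS setting the levels are coarser representations of the flow problem rather than nested 1D grids, and the fine-level cost is dominated by the (adjoint) Navier-Stokes solves.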