Global Bayesian optimization (GBO) is a derivative-free optimization method that is widely used in the tech industry to optimize objective functions that are expensive to evaluate, numerically or otherwise. We discuss the use of GBO in ensemble data assimilation (DA), where the goal is to update the state of a numerical model in view of noisy observations. Specifically, we consider three tasks: (i) the estimation of model parameters; (ii) the tuning of localization and inflation in ensemble DA; (iii) the combined task of estimating model parameters while simultaneously tuning the localization and inflation of the ensemble DA. For all three tasks, GBO works 'offline': a set of 'training' observations is used within GBO to determine appropriate model or localization/inflation parameters, which are subsequently deployed within an ensemble DA system. Because of its offline nature, GBO can easily be combined with existing DA systems, and it can effectively decouple (nearly) linear/Gaussian aspects of a problem from highly nonlinear/non-Gaussian ones. We illustrate the use of GBO in simple numerical experiments with the classical Lorenz problems. Our main goals are to introduce GBO in the context of ensemble DA and to spark interest in GBO and its uses for streamlining important tasks in ensemble DA.