We propose and analyze deterministic multilevel approximations for Bayesian inversion of operator equations with uncertain distributed parameters, subject to additive Gaussian noise in the measurement data. The algorithms use a multilevel (ML) approach based on deterministic, higher order quasi-Monte Carlo (HoQMC) quadrature to approximate the high-dimensional expectations that arise in the Bayesian estimators, and a Petrov-Galerkin (PG) method to approximate the solution of the underlying partial differential equation (PDE). This extends the previous single-level approach of [J. Dick, R. N. Gantner, Q. T. Le Gia and Ch. Schwab, Higher order Quasi-Monte Carlo integration for Bayesian estimation. Report 2016-13, Seminar for Applied Mathematics, ETH Zürich (in review)]. Compared to the single-level approach, the present convergence analysis of the multilevel method requires stronger assumptions on the holomorphy and regularity of the countably parametric uncertainty-to-observation maps of the forward problem. As in the single-level case and in the affine-parametric case analyzed in [J. Dick, F. Y. Kuo, Q. T. Le Gia and Ch. Schwab, Multi-level higher order QMC Galerkin discretization for affine parametric operator equations. Accepted for publication in SIAM J. Numer. Anal., 2016], we obtain sufficient conditions under which the algorithms achieve arbitrarily high, algebraic convergence rates in terms of work, independent of the dimension of the parameter space. The convergence rates are limited only by the spatial regularity of the forward problem, the discretization order achieved by the Petrov-Galerkin discretization, and the sparsity of the uncertainty parametrization. We provide detailed numerical experiments for linear elliptic problems in two space dimensions, with s = 1024 parameters characterizing the uncertain input, confirming the theory and showing that the ML HoQMC algorithms outperform both multilevel Monte Carlo (MLMC) methods and single-level (SL) HoQMC methods in terms of error versus computational work.