The identification of parameters in mathematical models using noisy observations is a common task in uncertainty quantification. We employ the framework of Bayesian inversion: we combine monitoring and observational data with prior information to estimate the posterior distribution of a parameter. Specifically, we are interested in the distribution of a diffusion coefficient of an elliptic PDE. In this setting, the sample space is high-dimensional, and each sample of the PDE solution is expensive. To address these issues we propose and analyse a novel Sequential Monte Carlo (SMC) sampler for the approximation of the posterior distribution. Classical, single-level SMC constructs a sequence of measures, starting with the prior distribution and finishing with the posterior distribution. The intermediate measures arise from a tempering of the likelihood or, equivalently, a rescaling of the noise; the resolution of the PDE discretisation is fixed. In contrast, our estimator employs a hierarchy of PDE discretisations to decrease the computational cost. We construct a sequence of intermediate measures in which each update either decreases the temperature or increases the discretisation level. This idea builds on and generalises the multi-resolution sampler proposed in [P.S. Koutsourelakis, J. Comput. Phys., 228 (2009), pp. 6184-6211], where a bridging scheme is used to transfer samples from coarse to fine discretisation levels. Importantly, our choice between tempering and bridging is fully adaptive. We present numerical experiments in 2D space, comparing our estimator to single-level SMC and the multi-resolution sampler.

In contrast to deterministic regularisation techniques, the Bayesian approach to inverse problems uses the probabilistic framework of Bayesian inference. Bayesian inference is built on Bayes' formula in the formulation given by Laplace [34, II.1]. We remark that other formulations are possible; see e.g. the work by Matthies et al. [38].
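To fix notation, Bayes' formula in the density form commonly used for finite-dimensional BIPs can be sketched as follows; the symbols below are illustrative and are introduced here, not taken from the text. If the unknown $u$ has prior density $\pi_0$ and the data $y$ enter through a likelihood $L(y \mid u)$, the posterior density is

\[
\pi^y(u) \;=\; \frac{L(y \mid u)\,\pi_0(u)}{\displaystyle\int L(y \mid u')\,\pi_0(u')\,\mathrm{d}u'}.
\]

In the function-space setting discussed below, the same statement is instead expressed measure-theoretically, as a Radon–Nikodym derivative of the posterior with respect to the prior measure.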
We make use of the mathematical framework for Bayesian Inverse Problems (BIPs) given by Stuart [48]. Under weak assumptions, which we give below, one can show that the BIP is well-posed. The solution of the BIP is the conditional probability measure of the unknown parameter given the observations.

The Bayesian framework is very general and can handle different types of forward models. However, in this work we consider PDE-based forward models, and in particular an elliptic PDE. The exact solution of the associated BIP is often inaccessible for two reasons: (i) there is no closed-form expression for the posterior measure, and (ii) the underlying PDE cannot be solved analytically. We focus on (i), and study efficient approximations to the full posterior measure. Alternatively, one could approximate only the expectation of output quantities of interest with respect to the posterior measure, or estimate the model evidence, i.e. the normalisation constant of the posterior measure.

Typically, BIPs are approached with sampling-based methods, such as Markov Chain Monte Carlo (MCMC).
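As an illustration of the likelihood tempering that underlies the single-level SMC sampler described above, the following is a minimal, self-contained sketch on a toy conjugate-Gaussian problem. The model, the fixed temperature schedule, and the random-walk move step are assumptions chosen for illustration only; they are not the estimator analysed in this work.

```python
# Minimal sketch of single-level SMC with likelihood tempering (illustrative only).
import numpy as np

rng = np.random.default_rng(0)

# Toy inverse problem: observe y = theta + noise, noise ~ N(0, sigma^2),
# prior theta ~ N(0, 1). The posterior is Gaussian and known in closed form.
y, sigma = 1.0, 0.5

def log_likelihood(theta):
    return -0.5 * (y - theta) ** 2 / sigma**2

N = 5000
theta = rng.standard_normal(N)      # particles drawn from the prior
logw = np.zeros(N)                  # log-weights

# Temperatures beta: 0 corresponds to the prior, 1 to the posterior.
betas = np.linspace(0.0, 1.0, 11)
for b_prev, b_next in zip(betas[:-1], betas[1:]):
    # Incremental importance weights: likelihood^(b_next - b_prev).
    logw += (b_next - b_prev) * log_likelihood(theta)
    w = np.exp(logw - logw.max())
    w /= w.sum()
    # Multinomial resampling to combat weight degeneracy.
    idx = rng.choice(N, size=N, p=w)
    theta, logw = theta[idx], np.zeros(N)
    # One random-walk Metropolis move per particle, targeting the
    # tempered measure  pi_b ∝ exp(-theta^2/2) * L^b.
    prop = theta + 0.5 * rng.standard_normal(N)
    log_alpha = (b_next * (log_likelihood(prop) - log_likelihood(theta))
                 - 0.5 * (prop**2 - theta**2))
    accept = np.log(rng.random(N)) < log_alpha
    theta[accept] = prop[accept]

# Exact posterior mean for this conjugate model is y / (1 + sigma^2).
print(theta.mean(), y / (1 + sigma**2))
```

At temperature 0 the particles follow the prior; each step reweights, resamples, and rejuvenates them, so that at temperature 1 they approximate the posterior. A multilevel variant would additionally allow an update that refines the PDE discretisation instead of lowering the temperature.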