Markov chain Monte Carlo (MCMC) methods are a powerful and commonly used family of numerical methods for sampling from complex probability distributions. As applications of these methods grow in size and complexity, so does the need for efficient variants. In this paper, we present a particle ensemble algorithm. At each iteration, an importance sampling proposal distribution is formed using an ensemble of particles. A stratified sample is taken from this distribution and weighted under the posterior; a state-of-the-art ensemble transport resampling method is then used to create an evenly weighted sample ready for the next iteration. We demonstrate that this ensemble transport adaptive importance sampling (ETAIS) method outperforms MCMC methods with equivalent proposal distributions for low-dimensional problems, and in fact shows better-than-linear improvements in convergence rates with respect to the number of ensemble members. We also introduce a new resampling strategy, multinomial transformation (MT), which, while not as accurate as the ensemble transport resampler, is substantially less costly for large ensemble sizes and can be used in conjunction with ETAIS for complex problems. We also show how algorithmic parameters of the mixture proposal can be quickly tuned to optimise performance. In particular, we demonstrate the methodology's superior sampling for multimodal problems, such as those arising from inference for mixture models, and for problems with expensive likelihoods requiring the solution of a differential equation, for which speed-ups of orders of magnitude are demonstrated. Likelihood evaluations of the ensemble can be computed in a distributed manner, suggesting that this methodology is a good candidate for parallel Bayesian computations.

Markov chain Monte Carlo (MCMC) methods allow us to sample from complex probability distributions which we would not be able to sample from directly. In particular, these methods have revolutionised the way in which inverse problems can be tackled, allowing full posterior sampling within a Bayesian framework. However, this often comes at a very high cost, with a very large number of iterations required before the empirical approximation of the posterior is sufficiently accurate. Since each iteration may require an extremely expensive likelihood evaluation, many problems of interest are simply computationally intractable.

This problem has been tackled in a variety of different ways. One approach is to construct increasingly sophisticated MCMC methods which exploit the structure of the posterior to make more intelligent proposals, leading to more thorough exploration of the posterior in fewer iterations. For example, the Hamiltonian or hybrid Monte Carlo (HMC) algorithm uses gradient information and symplectic integrators to make very large moves in state space with relatively high acceptance probabilities [43]. Non-reversible methods are also becoming increasingly popular as they can improve mixing [3]. Riemann manifold Monte Carlo methods exploit the Riemann geometry of t...
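To illustrate how HMC, mentioned above, uses gradient information and a symplectic integrator, the following minimal sketch shows one HMC step with the standard leapfrog scheme. This is background illustration, not the paper's method; the function names, Gaussian momentum, step size, and trajectory length are all assumed for the example.

```python
import numpy as np

def hmc_step(q, log_post, grad_log_post, eps=0.1, n_leapfrog=20, rng=None):
    """One HMC step: leapfrog integration plus Metropolis accept/reject.
    Illustrative sketch; parameter choices are assumptions, not tuned values."""
    rng = rng or np.random.default_rng()
    p = rng.standard_normal(q.shape)            # resample Gaussian momentum
    q_new, p_new = q.copy(), p.copy()

    # Leapfrog (symplectic) integration of the Hamiltonian dynamics:
    # half momentum step, alternating full steps, final half momentum step.
    p_new += 0.5 * eps * grad_log_post(q_new)
    for _ in range(n_leapfrog - 1):
        q_new += eps * p_new
        p_new += eps * grad_log_post(q_new)
    q_new += eps * p_new
    p_new += 0.5 * eps * grad_log_post(q_new)

    # Metropolis correction: accept with probability exp(H_old - H_new),
    # where H(q, p) = -log_post(q) + |p|^2 / 2.
    h_old = -log_post(q) + 0.5 * np.dot(p, p)
    h_new = -log_post(q_new) + 0.5 * np.dot(p_new, p_new)
    return q_new if rng.random() < np.exp(h_old - h_new) else q
```

Because the leapfrog integrator nearly conserves the Hamiltonian, even long trajectories are accepted with high probability, which is what allows the large moves in state space described above.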
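To make the ETAIS iteration summarised in the abstract concrete, here is a minimal sketch of one iteration. It assumes an equally weighted Gaussian mixture proposal centred on the current ensemble (one possible choice; tuning the mixture proposal is discussed later in the paper), and, since the ensemble transport and MT resamplers are not specified in this section, plain multinomial resampling stands in for the resampling step. All function and parameter names are illustrative.

```python
import numpy as np

def etais_iteration(ensemble, log_post, prop_std=0.5, rng=None):
    """One ETAIS-style iteration (illustrative sketch, not the paper's code).

    ensemble : (M, d) array of current particles.
    log_post : callable returning the log posterior density at a point.
    """
    rng = rng or np.random.default_rng()
    M, d = ensemble.shape

    # Proposal: equally weighted Gaussian mixture centred on the particles.
    # Stratified sampling: draw exactly one candidate from each component.
    candidates = ensemble + prop_std * rng.standard_normal((M, d))

    # Importance weights: posterior density over mixture proposal density.
    log_q = np.empty(M)
    for i in range(M):
        sq = np.sum((candidates[i] - ensemble) ** 2, axis=1)
        comps = (-0.5 * sq / prop_std**2
                 - d * np.log(prop_std) - 0.5 * d * np.log(2 * np.pi))
        log_q[i] = np.logaddexp.reduce(comps) - np.log(M)  # log mixture pdf
    log_w = np.array([log_post(c) for c in candidates]) - log_q
    w = np.exp(log_w - log_w.max())
    w /= w.sum()

    # Resample back to equal weights. The paper uses an ensemble transport
    # resampler here; multinomial resampling is a simple stand-in.
    return candidates[rng.choice(M, size=M, p=w)]
```

The stratified draw, one candidate per mixture component, is what distinguishes this from i.i.d. sampling from the mixture and helps control the variance of the importance weights; the likelihood evaluations in the loop over candidates are independent, which is why the method parallelises naturally.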