Sampling sharp posteriors in high dimensions is a challenging problem, especially when gradients of the likelihood are unavailable. In low to moderate dimensions, affine-invariant methods, a class of gradient-free ensemble samplers, have proved successful at sampling concentrated posteriors. However, the number of ensemble members must exceed the dimension of the unknown state for the correct distribution to be targeted. In contrast, the preconditioned Crank-Nicolson (pCN) algorithm remains effective in high dimensions, but its samples become highly correlated when the posterior differs significantly from the prior. In this article we combine the two approaches in two different ways in an attempt to find a compromise. The first method inflates the pCN proposal covariance with that of the current ensemble, whilst the second performs approximately affine-invariant steps on a continually adapting low-dimensional subspace and pCN steps on its orthogonal complement.
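For context on the first method, recall the standard pCN proposal for a Gaussian prior $\mathcal{N}(m_0, C_0)$; the inflated variant written after it is only an illustrative sketch of the idea, in which $C_{\mathrm{ens}}$ denotes the empirical covariance of the current ensemble and $\gamma \ge 0$ is an assumed tuning parameter, not necessarily the exact construction used in the paper:
\[
  u' \;=\; m_0 + \sqrt{1-\beta^2}\,(u - m_0) + \beta\,\xi,
  \qquad \xi \sim \mathcal{N}(0, C_0),
\]
with the inflated proposal, schematically, drawing instead
\[
  \xi \sim \mathcal{N}\bigl(0,\; C_0 + \gamma\, C_{\mathrm{ens}}\bigr).
\]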