In this note we consider the robustness of posterior measures occurring in Bayesian inference with respect to perturbations of the prior measure and the log-likelihood function. This extends the well-posedness analysis of Bayesian inverse problems. In particular, we prove a general local Lipschitz continuous dependence of the posterior on the prior and the log-likelihood with respect to various common distances between probability measures, including the Hellinger distance, the Wasserstein distance, and the Kullback-Leibler divergence. We assume only that the likelihoods are bounded and measure their perturbations in an $L^p$-norm with respect to the prior. Under mild assumptions, the obtained robustness yields the well-posedness of Bayesian inverse problems; in particular, it yields well-posedness with respect to the Wasserstein distance, which is missing from the existing literature. Moreover, our results indicate an increasing sensitivity of Bayesian inference as the posterior becomes more concentrated, e.g., due to more data or more accurate data. This confirms and extends previous observations made in the sensitivity analysis of Bayesian inference.
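To fix notation for the reader, a schematic rendering of the setting and of the type of bound meant above follows; the symbols $\mu_0$, $\Phi$, $Z$ and the exact shape of the estimate are our own shorthand, not taken verbatim from the note.

% Bayes' rule: the posterior $\mu$ is the prior $\mu_0$ reweighted by the
% bounded likelihood $\exp(-\Phi)$ and normalized by the evidence $Z$.
\[
  \mu(\mathrm{d}u) \;=\; \frac{\exp(-\Phi(u))}{Z}\,\mu_0(\mathrm{d}u),
  \qquad
  Z \;=\; \int \exp(-\Phi(u))\,\mu_0(\mathrm{d}u) \;>\; 0.
\]
% Schematic form of the local Lipschitz robustness: for posteriors
% $\mu$, $\nu$ built from priors $\mu_0$, $\nu_0$ and log-likelihoods
% $\Phi$, $\Psi$, respectively,
\[
  d(\mu, \nu)
  \;\le\;
  C \bigl( d(\mu_0, \nu_0) + \|\Phi - \Psi\|_{L^p(\mu_0)} \bigr),
\]
% where $d$ stands for, e.g., the Hellinger or Wasserstein distance or
% the Kullback-Leibler divergence, and the local Lipschitz constant $C$
% grows as the evidence $Z$ shrinks, i.e., as the posterior concentrates;
% this is the sensitivity effect mentioned above. The precise constants
% and assumptions are given in the note itself.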