In Bayesian inverse problems, 'model error' refers to the discrepancy between the parameter-to-observable map that generates the data and the parameter-to-observable map that is used for inference. Model error is important because it can lead to misspecified likelihoods, and thus to incorrect inference. We consider some deterministic approaches for accounting for model error in inverse problems with additive Gaussian observation noise, where the parameter-to-observable map is the composition of a possibly nonlinear parameter-to-state map or 'model' and a linear state-to-observable map or 'observation operator'. Using local Lipschitz stability estimates of posteriors with respect to likelihood perturbations, we bound the symmetrised Kullback-Leibler divergence between the posterior generated by each approach and both the posterior associated to the true model and the posterior associated to the wrong model. Our bounds lead to criteria for choosing observation operators that mitigate the effect of model error on the posterior.
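For orientation, the standard formulation of this setup can be sketched as follows; the notation ($M$ for the parameter-to-state map, $O$ for the linear observation operator, $\Gamma$ for the noise covariance, $\mu_0$ for the prior, $\mu^y$ for the posterior) is chosen here for illustration and is not taken from the abstract:
\[
  y = (O \circ M)(\theta) + \eta, \qquad \eta \sim \mathcal{N}(0, \Gamma),
\]
\[
  \frac{\mathrm{d}\mu^{y}}{\mathrm{d}\mu_{0}}(\theta)
  \propto \exp\!\Bigl(-\tfrac{1}{2}\bigl\lVert \Gamma^{-1/2}\bigl(y - (O \circ M)(\theta)\bigr)\bigr\rVert^{2}\Bigr),
\]
and the posteriors arising from the true and the wrong model are compared via the symmetrised Kullback-Leibler divergence
\[
  d_{\mathrm{KL,sym}}(\mu_{1}, \mu_{2})
  = d_{\mathrm{KL}}(\mu_{1} \,\Vert\, \mu_{2}) + d_{\mathrm{KL}}(\mu_{2} \,\Vert\, \mu_{1}).
\]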