Many interesting problems in fields ranging from telecommunications to computational biology can be formalized in terms of large underdetermined systems of linear equations with additional constraints or regularizers. One of the most studied, the compressed sensing problem, consists of finding the solution with the smallest number of non-zero components of a given system of linear equations y = Fw, for a known measurement vector y and sensing matrix F. Here we address the compressed sensing problem within a Bayesian inference framework in which the sparsity constraint is remapped into a singular prior distribution (called spike-and-slab or Bernoulli-Gauss). A solution to the problem is attempted through the computation of marginal distributions via expectation propagation, an iterative computational scheme originally developed in statistical physics. We show that this strategy is more accurate than competing approaches when the entries of the measurement matrix are statistically correlated. For computational strategies based on the Bayesian framework, such as variants of belief propagation, this advantage is to be expected, as those methods implicitly rely on the hypothesis of statistical independence among the entries of the sensing matrix. Perhaps surprisingly, the method uniformly outperforms all the other state-of-the-art methods in our tests.
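To make the problem setup concrete, the following is a minimal Python sketch: a sparse signal is drawn from a spike-and-slab (Bernoulli-Gauss) prior, compressed through an underdetermined linear map y = Fw, and then reconstructed. The expectation-propagation solver discussed above is not reproduced here; a standard L1 basis-pursuit linear program stands in as a baseline recovery method. All function names, dimensions, and parameter values below are illustrative assumptions, not quantities from the paper.

```python
import numpy as np
from scipy.optimize import linprog

def spike_and_slab_sample(n, rho, sigma=1.0, rng=None):
    """Draw w from a spike-and-slab (Bernoulli-Gauss) prior: each entry
    is non-zero with probability rho and, when non-zero, Gaussian N(0, sigma^2)."""
    rng = np.random.default_rng(rng)
    support = rng.random(n) < rho          # Bernoulli "spike vs. slab" choice
    w = np.zeros(n)
    w[support] = rng.normal(0.0, sigma, support.sum())
    return w

def l1_recover(F, y):
    """Basis-pursuit baseline (not EP): min ||w||_1 subject to F w = y,
    written as a linear program via the split w = u - v with u, v >= 0."""
    m, n = F.shape
    c = np.ones(2 * n)                     # objective: sum(u) + sum(v) = ||w||_1
    A_eq = np.hstack([F, -F])              # F(u - v) = y
    res = linprog(c, A_eq=A_eq, b_eq=y,
                  bounds=[(0, None)] * (2 * n), method="highs")
    return res.x[:n] - res.x[n:]

rng = np.random.default_rng(0)
n, m, rho = 50, 30, 0.1                    # illustrative sizes: 30 equations, 50 unknowns
w_true = spike_and_slab_sample(n, rho, rng=rng)
F = rng.normal(size=(m, n)) / np.sqrt(m)   # i.i.d. Gaussian sensing matrix
y = F @ w_true                             # underdetermined measurements y = F w
w_hat = l1_recover(F, y)
print("residual:", np.linalg.norm(F @ w_hat - y))
print("l1 norms:", np.abs(w_hat).sum(), np.abs(w_true).sum())
```

The recovered w_hat satisfies the measurement constraints and has an L1 norm no larger than that of the true signal; with enough measurements relative to the sparsity level, basis pursuit typically recovers w exactly. Correlated sensing matrices, the regime where the abstract claims the EP approach excels, can be emulated by replacing the i.i.d. Gaussian F with rows drawn from a non-diagonal covariance.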