Standard backward error analyses for numerical linear algebra algorithms provide worst-case bounds that can significantly overestimate the backward error. Our recent probabilistic error analysis, which assumes rounding errors to be independent random variables [SIAM J. Sci. Comput., 41 (2019), pp. A2815--A2835], contains smaller constants, but its bounds can still be pessimistic. We perform a new probabilistic error analysis that assumes both the data and the rounding errors to be random variables and assumes only mean independence. We prove that for data with zero or small mean we can relax the existing probabilistic bounds of order $\sqrt{n}u$ to much sharper bounds of order $u$, which are independent of $n$. Our fundamental result is for summation and we use it to derive results for inner products, matrix--vector products, and matrix--matrix products. The analysis answers the open question of why random data distributed on $[-1,1]$ leads to smaller error growth for these kernels than random data distributed on $[0,1]$. We also propose a new algorithm for multiplying two matrices that transforms the rows of the first matrix to have zero mean, and we show that it can achieve significantly more accurate results than standard matrix multiplication.
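
The following NumPy sketch illustrates one way such a zero-mean transformation could be realized; the rank-one decomposition $A = A_0 + \mu e^T$ used here and the particular test setup are illustrative assumptions, not necessarily the paper's exact formulation. The observed improvement will depend on $n$ and on how the underlying BLAS accumulates the products.

```python
import numpy as np

def matmul_zero_mean(A, B):
    """Compute A @ B after shifting the rows of A to zero mean.

    Writes A = A0 + mu @ ones^T, where mu holds the row means of A, so that
    A @ B = A0 @ B + mu @ (ones^T @ B): a product with a zero-mean first
    factor plus a rank-one correction (one multiplication per entry).
    """
    mu = A.mean(axis=1, keepdims=True)       # m x 1 vector of row means
    A0 = A - mu                              # rows of A0 have zero mean
    colsums = B.sum(axis=0, keepdims=True)   # 1 x p row vector: ones^T @ B
    return A0 @ B + mu @ colsums

# Illustrative test: data uniform on [0, 1], where standard multiplication
# in single precision shows the sqrt(n)*u-type error growth.
rng = np.random.default_rng(0)
n = 4096
A = rng.uniform(0.0, 1.0, (100, n)).astype(np.float32)
B = rng.uniform(0.0, 1.0, (n, 100)).astype(np.float32)

C_ref = A.astype(np.float64) @ B.astype(np.float64)   # reference product
err_std = np.abs(A @ B - C_ref).max() / np.abs(C_ref).max()
err_zm = np.abs(matmul_zero_mean(A, B) - C_ref).max() / np.abs(C_ref).max()
print(f"standard:  {err_std:.2e}")
print(f"zero-mean: {err_zm:.2e}")
```

The rank-one correction `mu @ colsums` incurs only one rounding error per entry, so the overall error is dominated by the product with the zero-mean factor `A0`, which is the regime in which the sharper order-$u$ bounds apply.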