We study the fluctuations in the storage capacity of the symmetric binary perceptron or, equivalently, the fluctuations in the combinatorial discrepancy of a Gaussian matrix. Perkins and Xu [16] and Abbe, Li, and Sly [2] recently established a sharp threshold: for an explicit constant Kc, the discrepancy of a Gaussian matrix is Kc + o(1) with probability tending to one, although without a quantitative rate. We sharpen these results significantly. We show that the fluctuations of the discrepancy around Kc are at most of order log(n)/n, and we provide exponential tail bounds. Up to a logarithmic factor, this yields a tight characterization of the fluctuations of both the symmetric binary perceptron and combinatorial discrepancy.
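As a rough illustration of the type of estimate described above, the fluctuation bound with exponential tails could be phrased schematically as follows; this is a sketch only, where the constants C, c > 0 and the exact parametrization in t are placeholders rather than the paper's actual statement, A_n denotes a matrix with i.i.d. standard Gaussian entries, disc(A_n) its combinatorial discrepancy, and K_c the explicit threshold constant.

% Schematic form of the fluctuation bound (illustrative only; the
% constants C, c and the parametrization in t are placeholders, not
% the precise statement proved in the paper).
\[
  \mathbb{P}\left(
    \bigl|\operatorname{disc}(A_n) - K_c\bigr|
      \;\ge\; C\,\frac{\log n + t}{n}
  \right)
  \;\le\; C\, e^{-c t},
  \qquad t \ge 0.
\]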