We consider the problem of partitioning n integers chosen uniformly at random between 1 and 2^m into two subsets such that the discrepancy, the absolute value of the difference of their sums, is minimized. A partition is called perfect if the optimum discrepancy is 0 when the sum of all n integers in the original set is even, or 1 when the sum is odd. Parameterizing the random problem in terms of κ = m/n, we prove that the problem has a sharp threshold at κ = 1, in the sense that for κ < 1, there are many perfect partitions with probability tending to 1 as n → ∞, while for κ > 1, there are no perfect partitions with probability tending to 1. Moreover, we show that the derivative of the so-called entropy is discontinuous at κ = 1. We also determine the scaling window about the transition point: κ_n = 1 − (2n)^{−1} log_2 n + λ_n/n, by showing that the probability of a perfect partition tends to 1, 0, or some explicitly computable p(λ) ∈ (0, 1), depending on whether λ_n tends to −∞, ∞, or λ ∈ (−∞, ∞), respectively. For λ_n → −∞ fast enough, we show that the number of perfect partitions is Gaussian in the limit. For λ_n → ∞, we prove that with high probability the optimum partition is unique, and that the optimum discrepancy is Θ(2^{λ_n}). Within the window, i.e., if |λ_n| is bounded, we prove that the optimum discrepancy is bounded. Both for λ_n → ∞ and within the window, we determine the limiting distribution of the (scaled) discrepancy.
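To make the setup concrete, the following is a minimal illustrative sketch (ours, not from the paper) that estimates the probability of a perfect partition at a given κ = m/n by exact enumeration of subset sums. The helper names min_discrepancy, is_perfect, and trial are hypothetical, and the small n forced by exact enumeration only hints at the asymptotic threshold at κ = 1, since finite-size effects, including the −(2n)^{−1} log_2 n shift of the window, are substantial at this scale.

import random

def min_discrepancy(xs):
    # Exact optimum discrepancy via enumeration of all achievable subset sums.
    # Feasible only for small n, since the set of sums can have up to 2^n elements.
    total = sum(xs)
    reachable = {0}  # sums achievable by some subset of xs
    for x in xs:
        reachable |= {s + x for s in reachable}
    # A subset with sum s induces the split (s, total - s) with discrepancy |total - 2s|.
    return min(abs(total - 2 * s) for s in reachable)

def is_perfect(xs):
    # Perfect: discrepancy 0 when the total is even, 1 when it is odd.
    # The discrepancy always has the parity of the total, so this one test suffices.
    return min_discrepancy(xs) == sum(xs) % 2

def trial(n, kappa, rng=random):
    # One random instance at the given kappa = m/n: n integers uniform on {1, ..., 2^m}.
    m = max(1, round(kappa * n))
    xs = [rng.randint(1, 2 ** m) for _ in range(n)]
    return is_perfect(xs)

if __name__ == "__main__":
    n, trials = 18, 100
    for kappa in (0.6, 0.8, 1.0, 1.2, 1.4):
        hits = sum(trial(n, kappa) for _ in range(trials))
        print(f"kappa = {kappa:.1f}: perfect in {hits}/{trials} instances")

Exact enumeration limits this sketch to tiny instances; the point is only that the fraction of perfect instances visibly drops as κ crosses 1, in line with the threshold result above.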