For $S \subseteq \{0,1\}^n$, a Boolean function $f : S \to \{-1,1\}$ is a halfspace over $S$ if there exist $w \in \mathbb{R}^n$ and $\theta \in \mathbb{R}$ such that $f(x) = \mathrm{sign}(w \cdot x - \theta)$ for all $x \in S$. We give bounds on the size of the integer weights $w_1, \dots, w_n \in \mathbb{Z}$ that are required to represent halfspaces over Hamming balls centered at $0^n$, i.e., halfspaces over $S = \{0,1\}^n_{\leq k} \stackrel{\text{def}}{=} \{x \in \{0,1\}^n : x_1 + \cdots + x_n \leq k\}$. Such weight bounds for halfspaces over Hamming balls have immediate consequences for the performance of learning algorithms in the increasingly common scenario of learning from very high-dimensional categorical examples in which only a small number of features are active in each example.

We give upper and lower bounds on weight both for exact representation (when $\mathrm{sign}(w \cdot x - \theta)$ must equal $f(x)$ for every $x \in S$) and for $\varepsilon$-approximate representation (when $\mathrm{sign}(w \cdot x - \theta)$ may disagree with $f(x)$ for up to an $\varepsilon$ fraction of the points $x \in S$). Our results show that extremal bounds for exact representation are qualitatively rather similar whether the domain is all of $\{0,1\}^n$ or the Hamming ball $\{0,1\}^n_{\leq k}$, but extremal bounds for approximate representation are qualitatively very different between the two domains.
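To make the definitions concrete, the following minimal Python sketch (with hypothetical values for $n$, $k$, $w$, and $\theta$; the sign-at-zero convention is an assumption, not taken from the text) enumerates the Hamming ball $\{0,1\}^n_{\leq k}$ and checks whether $\mathrm{sign}(w \cdot x - \theta)$ represents a target function exactly ($\varepsilon = 0$) or $\varepsilon$-approximately.

```python
from itertools import combinations

def hamming_ball(n, k):
    """Yield all x in {0,1}^n with at most k ones, i.e. the ball {0,1}^n_{<=k}."""
    for weight in range(k + 1):
        for ones in combinations(range(n), weight):
            x = [0] * n
            for i in ones:
                x[i] = 1
            yield tuple(x)

def sign(v):
    # Output in {-1, +1}; treating 0 as +1 is an assumed convention.
    return 1 if v >= 0 else -1

def represents(w, theta, f, points, eps=0.0):
    """True if sign(w . x - theta) agrees with f on all but an eps fraction
    of the given points (eps = 0 corresponds to exact representation)."""
    pts = list(points)
    errors = sum(1 for x in pts
                 if sign(sum(wi * xi for wi, xi in zip(w, x)) - theta) != f(x))
    return errors <= eps * len(pts)

# Hypothetical example: n = 6, k = 2, target accepts points with at least 2 ones.
n, k = 6, 2
f = lambda x: 1 if sum(x) >= 2 else -1
w, theta = [1] * n, 1.5                      # integer weights, real threshold
print(represents(w, theta, f, hamming_ball(n, k)))           # exact check
print(represents(w, theta, f, hamming_ball(n, k), eps=0.1))  # eps-approximate check
```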