An (n, k)-Poisson Multinomial Distribution (PMD) is the distribution of the sum of n independent random vectors supported on the set B_k = {e_1, ..., e_k} of standard basis vectors in R^k. We prove a structural characterization of these distributions, showing that, for all ε > 0, any (n, k)-Poisson multinomial random vector is ε-close, in total variation distance, to the sum of a discretized multidimensional Gaussian and an independent (poly(k/ε), k)-Poisson multinomial random vector. Our structural characterization extends the multi-dimensional CLT of [VV11] by applying simultaneously to all approximation requirements ε. In particular, it overcomes factors depending on log n and, importantly, on the minimum eigenvalue of the PMD's covariance matrix.

We use our structural characterization to obtain an ε-cover, in total variation distance, of the set of all (n, k)-PMDs, significantly improving the cover size of [DP08, DP15] and obtaining the same qualitative dependence of the cover size on n and ε as the k = 2 cover of [DP09, DP14]. We further exploit this structure to show that (n, k)-PMDs can be learned to within ε in total variation distance from Õ_k(1/ε²) samples, which is near-optimal in its dependence on ε and independent of n. In particular, our result generalizes the single-dimensional result of [DDS12] for Poisson binomials to arbitrary dimension. Finally, as a corollary of our results on PMDs, we give an Õ_k(1/ε²) sample algorithm for learning (n, k)-sums of independent integer random variables (SIIRVs), which is near-optimal for constant k.
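To make the definition concrete, the following is a minimal sketch (with a hypothetical helper name, not taken from the paper) of how one would sample from an (n, k)-PMD: each of the n independent summands is a standard basis vector of R^k drawn from its own categorical distribution, and the PMD sample is their coordinate-wise sum.

```python
import random

def sample_pmd(probs):
    """Draw one sample of an (n, k)-PMD.

    probs: a list of n length-k probability vectors; the i-th vector
    gives the distribution of the i-th independent summand over the
    standard basis vectors e_1, ..., e_k of R^k.
    Returns a length-k count vector: the sum of the n basis-vector draws.
    """
    k = len(probs[0])
    total = [0] * k
    for p in probs:
        # Each summand is a basis vector e_j drawn with probability p[j];
        # adding e_j increments coordinate j of the running sum.
        j = random.choices(range(k), weights=p)[0]
        total[j] += 1
    return total

# Example: n = 5 i.i.d. uniform summands over k = 3 basis vectors.
x = sample_pmd([[1/3, 1/3, 1/3]] * 5)
assert sum(x) == 5  # the coordinates of a PMD sample always sum to n
```

Note that when k = 2 this reduces to (an encoding of) a Poisson binomial distribution, which is the single-dimensional case treated in [DDS12].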