The ability to precisely compute the location and direction of sounds in external space is a crucial perceptual skill for interacting efficiently with dynamic environments. Little is known, however, about how the human brain implements spatial hearing. In our study, we used fMRI to characterize the brain activity of humans listening to sounds moving leftward, rightward, upward, and downward, as well as to static sounds. Whole-brain univariate analyses contrasting moving and static sounds varying in their location revealed a robust functional preference for auditory motion in the bilateral human Planum Temporale (hPT). Importantly, multivariate pattern classification showed that hPT contains information about both auditory motion direction and, to a lesser extent, sound source location. More precisely, our classifier successfully decoded opposite axes of motion (vertical versus horizontal) but was less reliable at classifying opposite within-axis directions (left versus right, or up versus down), reminiscent of the axis-of-motion organization observed in the visual middle temporal cortex. Further multivariate analyses demonstrated that, although motion direction and sound source location rely on partially shared pattern geometries in hPT, the responses elicited by moving and static sounds remain highly distinct. Altogether, our results demonstrate that hPT codes for both auditory motion and sound source location, but that the neural computation underlying motion processing is more reliable than, and partially distinct from, the one supporting sound source location.