How do neurons in the inferior colliculus (IC) encode the spatial location of sound? We have addressed this question using a virtual auditory environment. For this purpose, the individual head-related transfer functions (HRTFs) of 18 guinea pigs were measured under free-field conditions for 122 locations covering the upper hemisphere. Of 257 neurons, 94% responded to the short (50-ms) white-noise stimulus at 70 dB sound pressure level (SPL). Of these responsive neurons, 80% were spatially tuned, with a receptive field smaller than a hemifield (at 70 dB); the remainder responded omnidirectionally or showed fractured receptive fields. The majority of neurons preferred directions in the contralateral hemisphere, but preferences for frontal or rear positions and for high elevations occurred frequently. For stimulation at 70 dB SPL, the average diameter of the receptive fields, based on the half-maximal response, was less than a quarter of the upper hemisphere. Neurons that preferred frontal directions responded weakly or not at all to posterior directions, and vice versa. Hence, front/back discrimination is present at the single-neuron level in the IC. When nonindividual HRTFs were used to create the stimuli, the spatial receptive fields of most neurons became larger, split into several parts, or changed position, or the response became omnidirectional. Variation of absolute sound intensity had little effect on the preferred directions of the neurons over a range of 20 to 40 dB above threshold; with increasing intensity, most receptive fields remained constant or expanded. Furthermore, we tested the influence of binaural decorrelation and stimulus bandwidth on spatial tuning. The vast majority of neurons with a low characteristic frequency (<2.5 kHz) lost spatial tuning under stimulation with binaurally uncorrelated noise, whereas high-frequency units were mostly unaffected. Most neurons that showed spatial tuning under broadband stimulation (white noise and 1-octave-wide noise) became omnidirectional when stimulated with 1/3-octave-wide noise.
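The stimulus manipulations summarized above (broadband white noise, band-limited noise of different widths, and binaurally uncorrelated noise) can be illustrated with a minimal sketch. The sampling rate, centre frequency, and filter design below are illustrative assumptions, not the parameters used in the study.

```python
# Sketch of the stimulus types described in the abstract (assumed parameters).
import numpy as np
from scipy.signal import butter, lfilter

fs = 97656                      # sampling rate in Hz (hypothetical)
n = int(fs * 0.050)             # 50-ms noise token

def band_noise(fc, octaves, fs, n):
    """Gaussian noise band-limited to `octaves` around centre frequency fc (Hz)."""
    lo = fc * 2 ** (-octaves / 2)
    hi = fc * 2 ** (octaves / 2)
    b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return lfilter(b, a, np.random.randn(n))

# 1-octave- and 1/3-octave-wide noise centred on an example frequency
wide   = band_noise(fc=4000.0, octaves=1.0, fs=fs, n=n)
narrow = band_noise(fc=4000.0, octaves=1 / 3, fs=fs, n=n)

# Binaurally uncorrelated stimulation: independent noise tokens at the two ears
left  = np.random.randn(n)
right = np.random.randn(n)
```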
INTRODUCTION

External sounds are diffracted and partly shadowed by the head and body and are then further modified as they enter the external ear. Thus, spectral modifications that are specific to the direction of the sound source relative to the listener's head are imposed on the incoming sound signals on their way to the eardrums. It is possible to measure these head-related transfer functions (HRTFs) by recording signals in the ear canal after presentation of sounds at different locations (for review, see Blauert 1996). Presenting sounds convolved with the HRTFs through earphones can result in percepts that are indistinguishable from the original external sounds. Thus, these virtual acoustic space signals contain all the information necessary for localization: interaural cues [the interaural level difference (ILD) and interaural time difference (ITD)] as well as monaural spectral cues. Because the external ear can be described as a linear time...
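As a minimal sketch of how such virtual acoustic space stimuli are constructed, the head-related impulse responses (the time-domain form of the HRTFs) for one source direction are convolved with the source signal to give the two earphone signals. The array contents and sampling rate below are placeholders, not data from the study; in practice the impulse responses come from the in-ear recordings.

```python
# Hedged sketch: building a virtual-space stimulus by HRIR convolution.
import numpy as np

fs = 97656                                   # sampling rate in Hz (hypothetical)
noise = np.random.randn(int(fs * 0.050))     # 50-ms broadband noise token

# Placeholder impulse responses for one direction (unit impulses here);
# real HRIRs carry the direction-specific spectral shaping, ILD, and ITD.
hrir_left = np.zeros(256);  hrir_left[0] = 1.0
hrir_right = np.zeros(256); hrir_right[0] = 1.0

# Convolution imposes the cues of the chosen direction on the source signal.
left_ear = np.convolve(noise, hrir_left)
right_ear = np.convolve(noise, hrir_right)
```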