Speech perception requires the rapid and effortless extraction of meaningful phonetic information from a highly variable acoustic signal. A powerful example of this phenomenon is categorical speech perception, in which a continuum of acoustically varying sounds is transformed into perceptually distinct phoneme categories. Here we show that the neural representation of speech sounds is categorically organized in the human posterior superior temporal gyrus. Using intracranial high-density cortical surface arrays, we found that listening to synthesized speech stimuli varying in small, acoustically equal steps evoked distinct and invariant cortical population response patterns that were organized by their sensitivities to critical acoustic features. Phonetic category boundaries were similar between neurometric and psychometric functions. Although speech-sound responses were distributed, spatially discrete cortical loci were found to underlie specific phonetic discriminations. These results provide direct evidence for acoustic-to-higher-order phonetic encoding of speech sounds in human receptive language cortex.
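The boundary comparison above rests on fitting sigmoid functions to both behavioral and neural categorization data along the continuum. As a hedged illustration of that kind of analysis (not the authors' code; all data and parameter values below are synthetic placeholders), one can fit a logistic function to the proportion of category-B responses at each continuum step and compare the inflection points of the psychometric and neurometric fits:

```python
# Illustrative sketch: compare psychometric and neurometric category
# boundaries by fitting logistic functions to synthetic categorization data.
import numpy as np
from scipy.optimize import curve_fit

def logistic(x, boundary, slope):
    """Probability of reporting category B at continuum step x."""
    return 1.0 / (1.0 + np.exp(-slope * (x - boundary)))

rng = np.random.default_rng(0)
steps = np.arange(1, 15)  # 14 acoustically equal continuum steps (hypothetical)

# Synthetic placeholder data, NOT measurements from the study:
p_behavior = logistic(steps, 7.2, 1.1) + rng.normal(0, 0.03, steps.size)
p_neural = logistic(steps, 7.5, 0.9) + rng.normal(0, 0.05, steps.size)

(b_psy, _), _ = curve_fit(logistic, steps, np.clip(p_behavior, 0, 1), p0=[7, 1])
(b_neu, _), _ = curve_fit(logistic, steps, np.clip(p_neural, 0, 1), p0=[7, 1])
print(f"psychometric boundary: {b_psy:.2f}, neurometric boundary: {b_neu:.2f}")
```

Similar boundary estimates from the two fits would indicate, as reported above, that the neural population response categorizes the continuum much as listeners do.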
The brain should integrate related but not unrelated information from different senses, and the temporal patterning of inputs to different modalities may provide critical information about whether those inputs are related. We studied the effects of temporal correspondence between auditory and visual streams on human brain activity with functional magnetic resonance imaging (fMRI). Streams of visual flashes with irregularly jittered, arrhythmic timing could appear on the right or left, either alone, with a stream of auditory tones that coincided with them perfectly (highly unlikely by chance), or with a noncoincident auditory stream (a different erratic, arrhythmic pattern with the same temporal statistics); an auditory stream could also appear alone. fMRI revealed blood oxygenation level-dependent (BOLD) increases in the multisensory superior temporal sulcus (mSTS) contralateral to a visual stream when it coincided with an auditory stream, and BOLD decreases for noncoincidence relative to unisensory baselines. Contralateral primary visual cortex and auditory cortex were also affected by audiovisual temporal correspondence or noncorrespondence, as confirmed in individual participants. Connectivity analyses indicated an enhanced influence of mSTS on primary sensory areas, rather than vice versa, during audiovisual correspondence. Thus, temporal correspondence between auditory and visual streams affects a network of multisensory (mSTS) and sensory-specific areas in humans, including even primary visual and auditory cortex, with stronger responses for corresponding, and thus related, audiovisual inputs.
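The key manipulation is that the coincident and noncoincident auditory streams share identical temporal statistics and differ only in their event-by-event alignment with vision. A minimal sketch of that stimulus logic (our assumption about one possible implementation, not the authors' code) is to reuse the same set of inter-event intervals in a different order:

```python
# Sketch of the stimulus design: arrhythmic visual flashes plus an auditory
# stream that is either perfectly coincident or noncoincident-but-matched.
import numpy as np

rng = np.random.default_rng(0)
n_events = 30
# Erratic inter-event intervals in seconds (assumed range, for illustration):
intervals = rng.uniform(0.2, 1.2, n_events)

visual_times = np.cumsum(intervals)    # arrhythmic flash onsets
coincident_audio = visual_times.copy() # tones locked to the flashes

# Same intervals in shuffled order: identical temporal statistics,
# but a different erratic pattern that does not coincide with vision.
noncoincident_audio = np.cumsum(rng.permutation(intervals))
```

Because the noncoincident stream is built from the very same intervals, any BOLD difference between conditions can be attributed to audiovisual correspondence rather than to the streams' marginal timing statistics.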
Human colour vision originates in the cone photoreceptors, whose spatial density peaks in the fovea and declines rapidly into the periphery. For this reason, one expects to find a large representation of the cone-rich fovea in those cortical locations that support colour perception. Human occipital cortex contains several distinct foveal representations, including at least two that extend onto the ventral surface: a region thought to be critical for colour vision. To learn more about these ventral signals, we used functional magnetic resonance imaging to identify visual field maps and colour responsivity on the ventral surface. We found a visual map of the complete contralateral hemifield in a 4 cm² region adjacent to ventral V3; the foveal representation of this map is confluent with that of areas V1/2/3. Additionally, a distinct foveal representation is present on the ventral surface, situated 3-5 cm anterior to the confluent V1/2/3 foveal representations. This organization is not consistent with the definition of area V8, which assumes the presence of a quarter-field representation adjacent to V3v. Comparisons of responses to luminance-matched coloured and achromatic patterns show increased activity to the coloured stimuli beginning in area V1 and extending through the new hemifield representation and further anterior in the ventral occipital lobe.
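Visual field maps of this kind are conventionally measured with phase-encoded (traveling-wave) retinotopy, in which the phase of each voxel's BOLD response at the stimulus cycling frequency indexes its preferred visual field position. The following is a hedged sketch of that standard readout on a synthetic time series (an assumption about the general method, not this paper's specific pipeline):

```python
# Sketch of phase-encoded retinotopy: estimate a voxel's preferred
# polar angle from the phase of its response at the stimulus frequency.
import numpy as np

rng = np.random.default_rng(0)
n_timepoints, n_cycles = 240, 8  # e.g., 8 wedge rotations per scan (assumed)
t = np.arange(n_timepoints)

# Synthetic voxel time series responding at the stimulus frequency:
true_phase = 1.3
bold = (np.cos(2 * np.pi * n_cycles * t / n_timepoints - true_phase)
        + 0.5 * rng.standard_normal(n_timepoints))

# The Fourier coefficient at the stimulus frequency carries the phase,
# which maps (after hemodynamic-delay correction) to polar angle.
spectrum = np.fft.rfft(bold)
est_phase = -np.angle(spectrum[n_cycles])
print(f"estimated phase: {est_phase:.2f} rad (true {true_phase:.2f})")
```

Applying this voxel-by-voxel across the ventral surface yields the angle and eccentricity maps from which hemifield versus quarter-field organization, the issue at stake in the V8 debate above, can be judged.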
Although color plays a prominent part in our subjective experience of the visual world, the evolutionary advantage of color vision remains unclear [1] [2], with most current answers pointing toward specialized uses, for example detecting ripe fruit amongst foliage [3] [4] [5] [6]. We investigated whether color has a more general role in visual recognition by examining its contribution to the encoding and retrieval processes involved in pattern recognition [7] [8] [9]. Recognition accuracy was higher for color images of natural scenes than for luminance-matched black-and-white images, and color information contributed to both components of the recognition process. At encoding, color confers an image-coding advantage at the very early stages of sensory processing, most probably by easing the image-segmentation task. Later, color confers an advantage at retrieval, presumably as the result of an enhanced image representation in memory due to the additional attribute. Our results ascribe to color vision a general role in the processing of visual form, starting at the very earliest stages of analysis: color helps us to recognize things faster and to remember them better.
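Constructing the achromatic comparison images requires removing chromatic content while holding luminance fixed. As an illustrative sketch (a standard approach we assume for exposition, not necessarily the authors' exact calibration procedure), one can replace each pixel's RGB values with its Rec. 709 luminance computed on linearized sRGB:

```python
# Sketch: produce a luminance-matched achromatic version of a color image.
import numpy as np

def to_luminance_matched_gray(rgb):
    """rgb: float array in [0, 1], shape (H, W, 3), sRGB-encoded."""
    # Linearize sRGB before computing luminance.
    linear = np.where(rgb <= 0.04045,
                      rgb / 12.92,
                      ((rgb + 0.055) / 1.055) ** 2.4)
    # Rec. 709 luminance weighting of the linear RGB channels.
    y = linear @ np.array([0.2126, 0.7152, 0.0722])
    # Re-encode to sRGB and replicate across the three channels.
    y_srgb = np.where(y <= 0.0031308,
                      12.92 * y,
                      1.055 * y ** (1 / 2.4) - 0.055)
    return np.repeat(y_srgb[..., None], 3, axis=-1)
```

The resulting image preserves each pixel's luminance while discarding hue and saturation, so any recognition difference between the two versions can be attributed to the chromatic attribute itself.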