Behavioral and neurophysiological effects of word imageability and concreteness remain a topic of central interest in cognitive neuroscience and could provide essential clues for understanding how the brain processes conceptual knowledge. We examined these effects using event-related functional magnetic resonance imaging while participants identified concrete and abstract words. Relative to nonwords, concrete and abstract words both activated a left-lateralized network of multimodal association areas previously linked with verbal semantic processing. Areas in the left lateral temporal lobe were equally activated by both word types, whereas bilateral regions including the angular gyrus and the dorsal prefrontal cortex were more strongly engaged by concrete words. Relative to concrete words, abstract words activated left inferior frontal regions previously linked with phonological and verbal working memory processes. The results show overlapping but partly distinct neural systems for processing concrete and abstract concepts, with greater involvement of bilateral association areas during concrete word processing and almost exclusively left-hemisphere processing of abstract concepts.
Physiological studies of auditory perception have not yet clearly distinguished sensory from decision processes. In this experiment, human participants identified speech sounds masked by varying levels of noise while blood oxygenation signals in the brain were recorded with functional magnetic resonance imaging (fMRI). Accuracy and response time were used to characterize the behavior of sensory and decision components of this perceptual system. Oxygenation signals in a cortical subregion just anterior and lateral to primary auditory cortex predicted accuracy of sound identification, whereas signals in an inferior frontal region predicted response time. Our findings provide neurophysiological evidence for a functional distinction between sensory and decision mechanisms underlying auditory object identification. The present results also indicate a link between inferior frontal lobe activation and response-selection processes during auditory perception tasks.
In previous functional neuroimaging studies, left anterior temporal and temporal-parietal areas responded more strongly to sentences than to randomly ordered lists of words. The smaller response for word lists could be explained by either (1) less activation of syntactic processes due to the absence of syntactic structure in the random word lists or (2) less activation of semantic processes resulting from failure to combine the content words into a global meaning. To test these two explanations, we conducted a functional magnetic resonance imaging study in which word order and combinatorial word meaning were independently manipulated during auditory comprehension. Subjects heard six different stimulus types: normal sentences, semantically incongruent sentences in which content words were randomly replaced with other content words, pseudoword sentences, and versions of these three sentence types in which word order was randomized to remove syntactic structure. Effects of syntactic structure (greater activation to sentences than to word lists) were observed in the left anterior superior temporal sulcus and left angular gyrus. Semantic effects (greater activation to semantically congruent stimuli than to either incongruent or pseudoword stimuli) were seen in widespread, bilateral temporal lobe areas and the angular gyrus. Of the two regions that responded to syntactic structure, the angular gyrus showed a greater response to semantic structure, suggesting that the reduced activation for word lists in this area reflects a disruption of semantic processing. The anterior temporal lobe, on the other hand, was relatively insensitive to manipulations of semantic structure, suggesting that syntactic information plays a greater role in driving activation in this area.
The temporal lobe in the left hemisphere has long been implicated in the perception of speech sounds. Little is known, however, regarding the specific function of different temporal regions in the analysis of the speech signal. Here we show that an area extending along the left middle and anterior superior temporal sulcus (STS) is more responsive to familiar consonant-vowel syllables during an auditory discrimination task than to comparably complex auditory patterns that cannot be associated with learned phonemic categories. In contrast, areas in the dorsal superior temporal gyrus bilaterally, closer to primary auditory cortex, are activated to the same extent by the phonemic and nonphonemic sounds. Thus, the left middle/anterior STS appears to play a role in phonemic perception. It may represent an intermediate stage of processing in a functional pathway linking areas in the bilateral dorsal superior temporal gyrus, presumably involved in the analysis of physical features of speech and other complex non-speech sounds, to areas in the left anterior STS and middle temporal gyrus that are engaged in higher-level linguistic processes.