Accumulating evidence suggests the existence of a human action recognition system involving inferior frontal, parietal, and superior temporal regions that may participate in both the perception and execution of actions. However, little is known about the specificity of this system in response to different forms of human action. Here we present data from PET neuroimaging studies of passive viewing of three distinct action types: intransitive self-oriented actions (e.g., stretching, rubbing one's eyes), transitive object-oriented actions (e.g., opening a door, lifting a cup to the lips to drink), and the abstract, symbolic action-signs used in American Sign Language. Our results show that these different classes of human action engage a frontal/parietal/STS human action recognition system in a highly similar fashion. However, the results indicate that this neural consistency across motion classes holds primarily for hearing subjects. Data from deaf signers show a non-uniform response to different classes of human action. As expected, deaf signers engaged left-hemisphere perisylvian language areas during the perception of signed language signs. Surprisingly, these subjects did not engage the expected frontal/parietal/STS circuitry during passive viewing of nonlinguistic actions, but rather reliably activated middle-occipital temporal-ventral regions known to participate in the detection of human bodies, faces, and movements. Comparisons with data from hearing subjects establish statistically significant contributions of middle-occipital temporal-ventral regions during the processing of non-linguistic actions in deaf signers. These results suggest that during human motion processing, deaf individuals may engage specialized neural systems that allow for rapid, online differentiation of meaningful linguistic actions from non-linguistic human movements.
Children with and without behavioral dichotic left-ear deficits participated in an event-related potential study with quasidichotic presentations of familiar fairy tale segments. Electrical activity was recorded from the scalp while the children listened for semantically and/or syntactically anomalous words from either the right side or the left side while competing segments of the fairy tale were simultaneously presented from the opposite side. Latencies and amplitudes were averaged for each target condition within the group with dichotic left-ear deficits (LED) and the group with normal dichotic listening performance (WNL). Individual global field power waveforms and topographic brain maps were generated for the average response in each of the two listening conditions, target right and target left. Cross-correlations were performed on the grand averaged global field power waveforms to measure the degree of synchrony between target right and target left responses in both groups. Integration functions were computed to compare the accumulated sum of voltages during target (right and left) and control (right and left) conditions. WNL children produced typical ERP responses to the target words in both target right and target left conditions. Responses from LED children showed delayed latencies in the target left condition and reduced amplitudes in both target conditions. Topographic brain maps revealed more lateralized scalp distributions and greater activation of frontal regions in LED children in the target left condition. Cross-correlational and integration function results demonstrated interaural asymmetries in responses from the LED children. Overall results suggest that slowed neural conduction times, poor interhemispheric transfer of neural activity, and a failure to suppress competing information arriving at the right ear may be involved in poor left-sided processing in children with behavioral left-ear dichotic deficits.
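The waveform analyses described above (global field power, cross-correlation of the target-right and target-left responses, and the cumulative-voltage integration function) can be sketched as follows. This is a minimal illustration under standard definitions, not the authors' actual analysis pipeline; the array shapes and function names are assumptions.

```python
import numpy as np

def global_field_power(erp):
    """Global field power (GFP): the spatial standard deviation
    across electrodes at each time point. `erp` has shape
    (n_electrodes, n_samples)."""
    return erp.std(axis=0)

def gfp_cross_correlation(gfp_a, gfp_b):
    """Normalized cross-correlation between two GFP waveforms.
    The peak value indexes the degree of synchrony between the
    target-right and target-left responses; the lag of the peak
    indexes any latency shift between them."""
    a = (gfp_a - gfp_a.mean()) / gfp_a.std()
    b = (gfp_b - gfp_b.mean()) / gfp_b.std()
    xcorr = np.correlate(a, b, mode="full") / len(a)
    lags = np.arange(-len(a) + 1, len(a))
    peak = int(np.argmax(xcorr))
    return xcorr[peak], int(lags[peak])

def integration_function(voltages):
    """Accumulated (running) sum of voltages over time, used to
    compare target and control conditions."""
    return np.cumsum(voltages)
```

Identical waveforms yield a peak correlation of 1.0 at zero lag; a latency-shifted response in one condition shows up as a nonzero peak lag.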
In an attempt to develop a more ecologically valid measure of speech understanding in a background of competing speech, we constructed a quasidichotic procedure based on the monitoring of continuous speech from loudspeakers placed directly to the listener's right and left sides. The listener responded to the presence of incongruous or anomalous words embedded within the context of two children's fairy tales. Attention was directed either to the right or to the left side in blocks of 25 utterances. Within each block, there were target (anomalous) and nontarget (nonanomalous) words. Responses to target words were analyzed separately for attend-right and attend-left conditions. Our purpose was twofold: (1) to evaluate the feasibility of such an approach for obtaining electrophysiologic performance measures in the sound field and (2) to gather normative interaural symmetry data for the new technique in young adults with normal hearing. Event-related potentials to target and nontarget words at 30 electrode sites were obtained in 20 right-handed young adults with normal hearing. Waveforms and associated topographic maps were characterized by a slight negativity in the region of 400 msec (N400) and a robust positivity in the region of 900 msec (P900). Norms for interaural symmetry of the P900 event-related potential in young adults were derived. Abbreviations: CVC = consonant-vowel-consonant, EEG = electroencephalic, ERP = event-related potential, ISI = interaural symmetry index, VEOG = vertical electro-oculography
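The abbreviation list mentions an interaural symmetry index (ISI), but the abstract does not define it. A common way to quantify interaural symmetry of an ERP component such as P900 is a normalized right-minus-left difference; the sketch below uses that convention as an assumption, not the study's documented formula.

```python
def interaural_symmetry_index(amp_right, amp_left):
    """Illustrative interaural symmetry index: right-minus-left
    amplitude difference normalized by their sum, in percent.
    0 indicates perfect symmetry; positive values indicate a
    right-side advantage. (Assumed definition -- the study's
    exact ISI formula is not given in the abstract.)"""
    return 100.0 * (amp_right - amp_left) / (amp_right + amp_left)
```

For example, equal P900 amplitudes on both sides give an ISI of 0, while a right amplitude three times the left gives +50.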
Results are discussed in relation to auditory-specific outcomes on clinical tests for auditory processing disorder (APD).