Electrophysiological correlates of the processing of facial expressions were investigated in subjects performing a rapid serial visual presentation (RSVP) task. The peak latencies of the event-related potential (ERP) components P1, vertex positive potential (VPP), and N170 were 165, 240, and 240 ms, respectively. The early anterior N100 and posterior P1 amplitudes elicited by fearful faces were larger than those elicited by happy or neutral faces, a finding consistent with the presence of a 'negativity bias'. The amplitude of the anterior VPP was larger when subjects were processing fearful and happy faces than when they were processing neutral faces; it was similar in response to fearful and happy faces. The late N300 and P300 not only distinguished emotional faces from neutral faces but also differentiated between fearful and happy expressions at lag 2. The amplitudes of the N100, VPP, N170, N300, and P300 components and the latency of the P1 component were modulated by attentional resources: deficient attentional resources resulted in decreased amplitudes and increased latencies of ERP components. In light of these results, we present a hypothetical model involving three stages of facial expression processing.
Sudden changes in the acoustic environment enhance perceptual processing of subsequent visual stimuli that appear in close spatial proximity. Little is known, however, about the neural mechanisms by which salient sounds affect visual processing. In particular, it is unclear whether such sounds automatically activate visual cortex. To shed light on this issue, the present study examined event-related brain potentials (ERPs) triggered by peripheral sounds that either preceded task-relevant visual targets (Experiment 1) or were presented during purely auditory tasks (Experiments 2, 3, and 4). In all experiments the sounds elicited a contralateral ERP over the occipital scalp that was localized to neural generators in extrastriate visual cortex of the ventral occipital lobe. The amplitude of this cross-modal ERP was predictive of perceptual judgments about the contrast of co-localized visual targets. These findings demonstrate that sudden, intrusive sounds reflexively activate human visual cortex in a spatially specific manner, even during purely auditory tasks when the sounds are not relevant to the ongoing task.
Directing attention voluntarily to the location of a visual target results in an amplitude reduction (desynchronization) of the occipital alpha rhythm (8-14 Hz), which is predictive of improved perceptual processing of the target. Here we investigated whether modulations of the occipital alpha rhythm triggered by the involuntary orienting of attention to a salient but spatially non-predictive sound would similarly influence perception of a subsequent visual target. Target discrimination was more accurate when a sound preceded the target at the same location (validly cued trials) than when the sound was on the side opposite to the target (invalidly cued trials). This behavioral effect was accompanied by a sound-induced desynchronization of the alpha rhythm over the lateral occipital scalp. The magnitude of alpha desynchronization over the hemisphere contralateral to the sound predicted correct discriminations of validly cued targets but not of invalidly cued targets. These results support the conclusion that cue-induced alpha desynchronization over the occipital cortex is a manifestation of a general priming mechanism that improves visual processing, and that this mechanism can be activated either by the voluntary or involuntary orienting of attention. Further, the observed pattern of alpha modulations preceding correct and incorrect discriminations of valid and invalid targets suggests that involuntary orienting to the non-predictive sound has a rapid and purely facilitatory influence on processing targets on the cued side, with no inhibitory influence on targets on the opposite side.
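The alpha desynchronization measure described above is conventionally quantified as the percent change in alpha-band (8-14 Hz) power relative to a pre-cue baseline. The abstract does not specify the analysis pipeline, so the following is only a minimal illustrative sketch of one common approach (band-pass filtering plus the Hilbert-envelope estimate of instantaneous power); the function name, channel handling, and epoch timings are hypothetical, not taken from the study.

```python
# Illustrative sketch of alpha-band event-related desynchronization (ERD).
# All parameter choices (filter order, window indices) are assumptions.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def alpha_erd(epoch, fs, baseline, window, band=(8.0, 14.0)):
    """Percent change in alpha power from a pre-cue baseline.

    epoch    : 1-D array of EEG samples from one occipital channel
    fs       : sampling rate in Hz
    baseline : (start, stop) sample indices of the pre-cue baseline
    window   : (start, stop) sample indices of the post-cue interval
    """
    # Zero-phase band-pass filter in the alpha range
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    alpha = filtfilt(b, a, epoch)
    # Instantaneous power from the analytic-signal envelope
    power = np.abs(hilbert(alpha)) ** 2
    base = power[baseline[0]:baseline[1]].mean()
    post = power[window[0]:window[1]].mean()
    # Negative values indicate desynchronization (a power decrease)
    return 100.0 * (post - base) / base
```

On this convention, a stronger sound-induced desynchronization over the contralateral occipital scalp would correspond to a more negative ERD value in the post-cue window.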