Many behavioral measures of visual perception fluctuate continually in a rhythmic manner, reflecting the influence of endogenous brain oscillations, particularly theta (∼4-7 Hz) and alpha (∼8-12 Hz) rhythms [1-3]. However, it is unclear whether these oscillations are unique to vision or whether auditory performance also oscillates [4, 5]. Several studies report no oscillatory modulation in audition [6, 7], while those with positive findings suffer from confounds relating to neural entrainment [8-10]. Here, we used a bilateral pitch-identification task to investigate rhythmic fluctuations in auditory performance separately for the two ears and applied signal detection theory (SDT) to test for oscillations of both sensitivity and criterion (changes in decision boundary) [11, 12]. Using uncorrelated dichotic white noise to induce a phase reset of oscillations, we demonstrate that, as with vision, both auditory sensitivity and criterion showed strong oscillations over time, at different frequencies: ∼6 Hz (theta range) for sensitivity and ∼8 Hz (low alpha range) for criterion, implying distinct underlying sampling mechanisms [13]. The modulation of sensitivity in the left and right ears was in antiphase, suggestive of attention-like mechanisms sampling alternately from the two ears.
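The SDT decomposition referred to above separates how well an observer discriminates stimuli (sensitivity, d′) from their response bias (criterion, c), both computed from hit and false-alarm rates. As a minimal illustrative sketch (not the authors' analysis code; the example rates are hypothetical), the standard formulas are d′ = z(H) − z(F) and c = −(z(H) + z(F)) / 2, where z is the inverse of the standard normal CDF:

```python
from statistics import NormalDist

def sdt_measures(hit_rate: float, fa_rate: float) -> tuple[float, float]:
    """Return (d_prime, criterion) from hit and false-alarm rates.

    d' = z(H) - z(F)          : discriminability of signal vs. noise
    c  = -(z(H) + z(F)) / 2   : response bias (0 = unbiased)
    """
    z = NormalDist().inv_cdf  # inverse standard normal CDF (z-transform)
    z_hit, z_fa = z(hit_rate), z(fa_rate)
    d_prime = z_hit - z_fa
    criterion = -(z_hit + z_fa) / 2
    return d_prime, criterion

# Hypothetical example: symmetric hit/false-alarm rates give zero bias.
d_prime, criterion = sdt_measures(0.8, 0.2)
print(round(d_prime, 3), round(criterion, 3))  # → 1.683 0.0
```

Tracking d′ and c in short time bins after a phase-resetting event (here, the dichotic noise onset) is what allows sensitivity and criterion oscillations to be measured at different frequencies.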
Recent findings on multisensory integration suggest that selective attention influences cross-sensory interactions from an early processing stage. Yet, in the field of emotional face-voice integration, the hypothesis prevails that facial and vocal emotional information interacts preattentively. Using ERPs, we investigated the influence of selective attention on the perception of congruent versus incongruent combinations of neutral and angry facial and vocal expressions. Attention was manipulated via four tasks that directed participants to (i) the facial expression, (ii) the vocal expression, (iii) the emotional congruence between the face and the voice, and (iv) the synchrony between lip movement and speech onset. Our results revealed early interactions between facial and vocal emotional expressions, manifested as modulations of the auditory N1 and P2 amplitudes by incongruent emotional face-voice combinations. Although audiovisual emotional interactions within the N1 time window were affected by the attentional manipulations, interactions within the P2 time window showed no such attentional influence. Thus, we propose that the N1 and P2 are functionally dissociated in terms of emotional face-voice processing and discuss evidence in support of the notion that the N1 is associated with cross-sensory prediction, whereas the P2 relates to the derivation of an emotional percept. Essentially, our findings put the integration of facial and vocal emotional expressions into a new perspective: one that regards the integration process as a composite of multiple, possibly independent subprocesses, some of which are susceptible to attentional modulation, whereas others may be influenced by additional factors.
Highlights
- We demonstrate the role of alpha rhythms in the propagation of perceptual history
- Auditory decisions were rhythmically biased by stimuli presented 1 or 2 trials back
- Bias oscillated at 9 Hz only when successive stimuli occurred in the same ear
- Alpha is strongly implicated in predictive perception and working memory formation