Background: Sensory suppression occurs when hearing one's self-generated voice, as opposed to passively listening to one's own voice. Changes in the quality of sensory feedback to the self-generated voice can increase attentional control. These changes affect the self-other voice distinction and might lead to hearing non-existent voices in the absence of an external source (i.e., auditory verbal hallucinations (AVH)). However, it is unclear how changes in sensory feedback processing and attention allocation interact, and how this interaction might relate to hallucination proneness (HP). Study Design: During EEG recordings, participants varying in HP self-generated and passively listened to their own voice, which varied in emotional quality and certainty of recognition (100% neutral, 60-40% neutral-angry, 50-50% neutral-angry, 40-60% neutral-angry, 100% angry). Study Results: The N1 auditory evoked potential was more suppressed for self-generated than for externally generated voices. Increased HP was associated with (i) an increased N1 response to self- compared to externally generated voices, (ii) a reduced N1 response to angry compared to neutral voices, and (iii) a reduced N2 response to unexpected voice quality in sensory feedback (60-40% neutral-angry) compared to neutral voices. Conclusions: The current study highlights an association between increased HP and systematic changes in the processing of emotional quality and certainty in sensory feedback (N1) and in attentional control (N2) during self-voice production in a non-clinical population. Considering that voice hearers also display these changes, these findings support the continuum hypothesis. However, additional research is needed to validate this conclusion.
Appraisals can be influenced by cultural beliefs and stereotypes. In line with this, past research has shown that judgments about the emotional expression of a face are influenced by the face's sex and, vice versa, that judgments about a person's sex depend to some extent on the person's facial expression. For example, participants associate anger with male faces, and female faces with happiness or sadness. However, the strength and the bidirectionality of these effects remain debated. Moreover, the interplay of a stimulus' emotion and sex remains largely unexplored in the auditory domain. To investigate these questions, we created a novel stimulus set of 121 avatar faces and 121 human voices (available at https://bit.ly/2JkXrpy) with matched, fine-scale changes along the emotional (happy to angry) and sex (male to female) dimensions. In a first experiment (N = 76), we found clear evidence for the mutual influence of facial emotion and sex cues on ratings, and moreover for larger implicit (task-irrelevant) effects of a stimulus' emotion than of its sex. These findings were replicated and extended in two preregistered studies: a laboratory categorization study using the same face stimuli (N = 108; https://osf.io/ve9an), and an online study with vocalizations (N = 72; https://osf.io/vhc9g). Overall, the results show that the associations of maleness with anger and femaleness with happiness exist across sensory modalities, and suggest that emotions expressed in the face and voice cannot be entirely disregarded, even when attention is mainly focused on determining a stimulus' sex. We discuss the relevance of these findings for cognitive and neural models of face and voice processing.
Stimuli that evoke emotions are salient, draw attentional resources, and facilitate situationally appropriate behavior in complex or conflicting environments. However, negative and positive emotions may motivate different response strategies. For example, a threatening stimulus might evoke avoidant behavior, whereas a positive stimulus may prompt approach behavior. Therefore, emotional stimuli might either elicit differential behavioral responses when a conflict arises or simply mark salience. The present study used functional magnetic resonance imaging to investigate valence-specific emotion effects on attentional control in conflict processing, employing an adapted flanker task with neutral, negative, and positive stimuli. Slower responses were observed for incongruent than for congruent trials. Neural activity in the dorsal anterior cingulate cortex was associated with conflict processing regardless of the emotional quality of the stimulus. These findings confirm that both negative and positive emotional stimuli mark salience in both low-conflict (congruent) and high-conflict (incongruent) scenarios. Regardless of conflict level, emotional stimuli recruited greater attentional resources in goal-directed behavior.