Research has shown the existence of perceptual and neural biases toward sounds perceived as sources approaching a listener versus receding from one. It has been suggested that the greater biological salience of approaching auditory sources may account for these effects. In addition, these effects may hold only for sources critical to our survival. In the present study, we provide support for these hypotheses by quantifying emotional responses to sounds with different intensity-change patterns. In 2 experiments, participants were exposed to artificial and natural sounds simulating approaching or receding sources. The auditory-induced emotional effect was reflected in participants' performance on an emotion-related behavioral task, their self-reported emotional experience, and their physiology (electrodermal activity and facial electromyography). The results of this study suggest that approaching unpleasant sound sources evoke more intense emotional responses in listeners than receding ones, whereas no such effect of perceived sound motion exists for pleasant or neutral sound sources. The emotional significance attributed to the sound source itself, the loudness of the sound, and the duration of the loudness change appear to be relevant factors in this disparity.
Research has demonstrated that two types of affect influence judgment and decision making: incidental affect (affect unrelated to the judgment or decision, such as a mood) and integral affect (affect that is part of the perceiver's internal representation of the option or target under consideration). So far, these two lines of research have seldom crossed, so knowledge concerning their combined effects is largely missing. To fill this gap, the present review highlights differences and similarities between integral and incidental affect. Further, common and unique mechanisms that enable these two types of affect to influence judgments and choices are identified. Finally, some basic principles for affect integration when the two sources co-occur are outlined. These mechanisms are discussed in relation to existing work that has focused on incidental or integral affect, but not both.
Humans receive a constant stream of input that potentially influences their affective experience. Despite intensive research on affect, it is still largely unknown how various sources of information are integrated into the single, unified affective experience that accompanies consciousness. Here, we aimed to investigate how the stream of evocative input we receive is dynamically represented in self-reported affect. In 4 experiments, participants viewed a number of sequentially presented images and reported their momentary affective experience on valence and arousal scales. The number and duration of images in a trial varied across studies. In Study 4, we also measured participants' physiological responses while they viewed the images. We formulated and compared several models with respect to their capacity to predict self-reported affect from normative image ratings, physiological measurements, and prior affective experience (measured in the previous trial). Our data best supported a model incorporating a temporally sensitive averaging mechanism for affective integration that assigns higher weights to affectively more potent and more recently presented stimuli. Crucially, affective averaging of sensory information and prior affect made distinct contributions to currently experienced affect. Taken together, the current study provides evidence that prior affect and the integrated affective impact of stimuli partly shape currently experienced affect.
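The weighted-averaging account described in this abstract can be illustrated with a small sketch. The Python code below is a hypothetical illustration only, not the authors' fitted model: the exponential recency decay, the potency exponent, and the parameter names (recency_decay, potency_exponent, prior_weight) are assumptions chosen to show how recency weighting, affective potency, and prior affect could be combined into a single prediction.

```python
import numpy as np

def predict_affect(stimulus_ratings, prior_affect,
                   recency_decay=0.7, potency_exponent=1.5, prior_weight=0.3):
    """Sketch of a temporally sensitive weighted-averaging model (assumed form).

    stimulus_ratings : normative valence (or arousal) ratings of the images in a
        trial, centered on the neutral point and ordered by presentation.
    prior_affect     : affect reported on the previous trial.
    recency_decay    : per-step decay applied to earlier stimuli (assumption).
    potency_exponent : >1 gives extra weight to affectively potent stimuli (assumption).
    prior_weight     : share of the prediction carried over from prior affect (assumption).
    """
    ratings = np.asarray(stimulus_ratings, dtype=float)
    n = len(ratings)

    # Recency weights: the most recent stimulus gets weight 1, earlier ones decay.
    recency = recency_decay ** np.arange(n - 1, -1, -1)

    # Potency weights: stimuli farther from the neutral point count more.
    potency = np.abs(ratings) ** potency_exponent

    weights = recency * potency
    integrated = np.sum(weights * ratings) / weights.sum() if weights.sum() > 0 else 0.0

    # Current affect blends the integrated stimulus impact with prior affect.
    return prior_weight * prior_affect + (1 - prior_weight) * integrated

# Example: three images, the last strongly negative, after a mildly positive trial.
print(predict_affect([0.2, -0.1, -0.8], prior_affect=0.3))
```

In this sketch, the recency and potency weights are multiplied and normalized so the stimulus term remains an average, and prior affect enters as a separate additive component, mirroring the abstract's claim that stimulus integration and prior affect make distinct contributions.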
Loudness perception is thought to be a modular system unaffected by other brain systems. We tested the hypothesis that loudness perception can be influenced by negative affect using a conditioning paradigm in which some auditory stimuli were paired with aversive experiences while others were not. We found that the same auditory stimulus was reported as louder, more negative, and more fear-inducing when it had been conditioned with an aversive experience than when it served as a control stimulus. This result supports an important role for emotion in auditory perception.
The ability to detect and localize sounds in the environment is critical for survival. Localizing sound sources is a computational challenge for the human brain because the auditory cortex appears to lack a topographical representation of space. However, attention and task demands can modulate localization performance. Here, we investigated whether localization performance for sounds occurring directly in front of or behind listeners could be modulated by emotional salience and sound-source location. We measured the emotion induced by ecological sounds occurring in the frontal or rear perceptual field and employed a speeded localization task. The results showed that both localization speed and accuracy were higher, and that stronger negative emotions were induced, when sound sources were behind the participants. Our results provide clear behavioral evidence that auditory attention can be influenced by sound-source location. Importantly, we also show that the effect of spatial location on attention is mediated by emotion, in line with the argument that emotional information is prioritized in processing. The auditory system functions as an alarm system: it detects potentially salient events and signals the need for a shift of attention. Further, spatial processing in the auditory dorsal pathway serves to guide the visual system to a particular location of interest. An auditory bias toward the space outside the visual field can therefore be useful, allowing visual attention to be shifted quickly toward emotionally significant information.