There is a small but growing literature on the perception of natural acoustic events, but few attempts have been made to investigate complex sounds not systematically controlled within a laboratory setting. The present study investigates listeners' ability to make judgments about the posture (upright-stooped) of the walker who generated acoustic stimuli contrasted on each trial. We use a comprehensive three-stage approach to event perception, in which we develop a solid understanding of the source event and its sound properties, as well as the relationships between these two event stages. Developing this understanding helps both to identify the limitations of common statistical procedures and to develop effective new procedures for investigating not only the two information stages above, but also the decision strategies employed by listeners in making source judgments from sound. The result is a comprehensive, ultimately logical, but not necessarily expected picture of both the source-sound-perception loop and the utility of alternative research tools.
The effects of stimulus-to-stimulus variation in emotional tone of voice on speech perception were examined through a series of perceptual experiments. Stimuli were recorded from human speakers who produced utterances in tones of voice designed to convey affective information. Then, stimuli varying in talker voice and emotional tone were presented to listeners for perceptual matching and classification. The results showed that both intertalker variation in talker voice and intratalker variation in emotional tone had a negative effect on perceptual performance. The results suggest that sources of variation in the speech signal that affect the spectral/temporal properties of speech (i.e., talker voice, speech rate, emotional tone) may be treated differently from sources of variation that do not affect these properties (i.e., vocal amplitude).
Change deafness, the auditory analog to change blindness, occurs when salient and behaviorally relevant changes to sound sources are missed. Missing significant changes in the environment can have serious consequences; however, this effect has remained little more than a lab phenomenon and a party trick. Only recently have researchers begun to explore the nature of these profound errors in change perception. Despite a wealth of examples of the change blindness phenomenon, work on change deafness remains fairly limited. The purpose of the current paper is to review the state of the literature on change deafness and to propose an explanation of change deafness that relies on factors related to stimulus information rather than on attentional or memory limits. To achieve this, work across several auditory research domains, including environmental sound classification, informational masking, and change deafness, is synthesized to present a unified perspective on the perception of change errors in complex, dynamic sound environments. We hope to extend previous research by describing how it may be possible to predict specific patterns of change perception errors based on varying degrees of similarity in stimulus features and uncertainty about which stimuli and features are important for a given perceptual decision.
Cross-modal interactions between sensory channels have been shown to depend on both the spatial disparity and the perceptual similarity between the presented stimuli. Here we investigate the behavioral and neural integration of auditory and tactile stimulus pairs at different levels of spatial disparity. Additionally, we modulated the amplitudes of both stimuli in either a coherent or non-coherent manner. We found that both auditory and tactile localization performance was biased towards the stimulus in the respective other modality. This bias increased linearly with stimulus disparity and was more pronounced for coherently modulated stimulus pairs. Analyses of electroencephalographic (EEG) activity at temporal–cortical sources revealed enhanced event-related potentials (ERPs) as well as decreased alpha and beta power during bimodal as compared to unimodal stimulation. However, while the observed ERP differences are similar for all stimulus combinations, the extent of oscillatory desynchronization varies with stimulus disparity. Moreover, when both stimuli were subjectively perceived as originating from the same direction, the reduction in alpha and beta power was significantly stronger. These observations suggest that in the EEG the level of perceptual integration is mainly reflected by changes in ongoing oscillatory activity.
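The alpha/beta power reduction described above (event-related desynchronization, ERD) is typically quantified as the change in band-limited power between conditions. The abstract does not specify the authors' pipeline; the following is a minimal, hypothetical sketch of one common approach, using Welch power spectra on simulated epochs in which the "bimodal" condition carries a weaker 10 Hz alpha rhythm:

```python
import numpy as np
from scipy.signal import welch

def band_power(epochs, fs, fmin, fmax):
    """Mean power in [fmin, fmax] Hz across epochs (epochs: n_epochs x n_samples)."""
    freqs, psd = welch(epochs, fs=fs, nperseg=fs)
    mask = (freqs >= fmin) & (freqs <= fmax)
    return psd[:, mask].mean()

# Simulated data (illustrative only, not the study's recordings)
rng = np.random.default_rng(0)
fs, n_epochs, n_samples = 250, 40, 500
t = np.arange(n_samples) / fs

def make_epochs(alpha_amp):
    noise = rng.normal(0.0, 1.0, (n_epochs, n_samples))
    alpha = alpha_amp * np.sin(2 * np.pi * 10 * t)  # 10 Hz alpha rhythm
    return noise + alpha

unimodal = make_epochs(2.0)   # stronger ongoing alpha
bimodal = make_epochs(0.5)    # desynchronized (reduced) alpha

p_uni = band_power(unimodal, fs, 8, 12)
p_bi = band_power(bimodal, fs, 8, 12)
erd_percent = 100 * (p_bi - p_uni) / p_uni  # negative values indicate desynchronization
```

In practice the same comparison would be run per subject and electrode on baseline-corrected epochs, with the unimodal condition serving as the reference against which bimodal desynchronization is measured.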
Important decisions in the heat of battle occur rapidly, and a key aptitude of a good combat soldier is the ability to determine whether he is under fire. This rapid decision requires the soldier to make a judgment in a fraction of a second, based on a barrage of multisensory cues coming from multiple modalities. The present study uses an oddball paradigm to examine listeners' ability to differentiate shooter locations from audio recordings of small arms fire. More importantly, we address the neural correlates involved in this rapid decision process by employing single-trial analysis of electroencephalography (EEG). In particular, we examine expert listeners as they differentiate the sounds of small arms firing events recorded at different observer positions relative to a shooter. Using signal detection theory, we find clear neural signatures related to shooter firing angle by identifying the times of neural discrimination on a trial-to-trial basis. Similar to previous results in oddball experiments, we find common windows relative to the response and the stimulus when neural activity discriminates between target stimuli (forward fire: observer 0° to firing angle) vs. standards (off-axis fire: observer 90° to firing angle). We also find, using windows of maximum discrimination, that auditory target vs. standard discrimination yields neural sources in Brodmann Area 19 (BA 19), i.e., in the visual cortex. In summary, we show that single-trial analysis of EEG yields informative scalp distributions and source current localization of discriminating activity when the small arms experts discriminate between forward and off-axis fire observer positions. Furthermore, this perceptual decision implicates brain regions involved in visual processing, even though the task is purely auditory. Finally, we utilize these techniques to quantify the level of expertise in these subjects for the chosen task, which has implications for human performance monitoring in combat.
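The signal-detection framework referenced above separates sensitivity from response bias by comparing hit and false-alarm rates for targets (forward fire) versus standards (off-axis fire). The study's exact analysis is not given in the abstract; a minimal sketch of the standard d′ computation, with hypothetical trial counts and a log-linear correction to avoid infinite z-scores at rates of 0 or 1, might look like this:

```python
from scipy.stats import norm

def dprime(hits, misses, false_alarms, correct_rejections):
    """Sensitivity index d' = z(hit rate) - z(false-alarm rate).

    Uses a log-linear correction (add 0.5 to counts, 1 to totals)
    so extreme rates of 0 or 1 do not produce infinite z-scores.
    """
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)

# Hypothetical counts for one listener:
# targets = forward fire (0 deg), standards = off-axis fire (90 deg)
d = dprime(hits=45, misses=5, false_alarms=8, correct_rejections=42)
```

The same index can be computed from the single-trial EEG discriminator's output within each analysis window, which is one way expertise for this task could be quantified across subjects.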