To perceive the auditory and visual aspects of a physical event as occurring simultaneously, the brain must adjust for differences between the two modalities in both physical transmission time and sensory processing time. One possible strategy to overcome this difficulty is to adaptively recalibrate the simultaneity point from daily experience of audiovisual events. Here we report that after exposure to a fixed audiovisual time lag for several minutes, human participants showed shifts in their subjective simultaneity responses toward that particular lag. This 'lag adaptation' also altered the temporal tuning of an auditory-induced visual illusion, suggesting that adaptation occurred via changes in sensory processing, rather than as a result of a cognitive shift while making task responses. Our findings suggest that the brain attempts to adjust subjective simultaneity across different modalities by detecting and reducing time lags between inputs that likely arise from the same physical events.
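A common way to model this 'lag adaptation' is as a shift of the point of subjective simultaneity (PSS) toward the adapted lag. The sketch below is a minimal illustration of that idea, not the authors' model; the Gaussian response curve, the adaptation rate, and all parameter values are assumptions.

```python
import numpy as np

def p_simultaneous(lag_ms, pss_ms, sigma_ms=80.0):
    """Probability of a 'simultaneous' response at a given audio-visual
    lag, modeled as a Gaussian tuning curve centered on the point of
    subjective simultaneity (PSS). Parameter values are illustrative."""
    return np.exp(-0.5 * ((lag_ms - pss_ms) / sigma_ms) ** 2)

def adapt_pss(pss_ms, adapted_lag_ms, rate=0.3):
    """One step of lag adaptation: move the PSS a fixed fraction of the
    way toward the lag experienced during exposure (assumed dynamics)."""
    return pss_ms + rate * (adapted_lag_ms - pss_ms)

pss = 0.0                          # before adaptation: physical simultaneity
for _ in range(10):                # repeated exposure to a 100 ms audio lag
    pss = adapt_pss(pss, adapted_lag_ms=100.0)

print(f"PSS after adaptation: {pss:.1f} ms")          # shifted toward 100 ms
print(f"P('simultaneous') at 100 ms lag: {p_simultaneous(100.0, pss):.2f}")
```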
To determine whether the temporal resolution of synchrony perception differs across audio-visual (AV), visuo-tactile (VT), and audio-tactile (AT) combinations, we compared synchrony-asynchrony discrimination thresholds of human participants. Visual and auditory stimuli were, respectively, a luminance-modulated Gaussian blob and amplitude-modulated white noise. Tactile stimuli were mechanical vibrations presented to the index finger. All stimuli were temporally modulated by either single pulses or repetitive-pulse trains. The results show that the temporal resolution of synchrony perception was similar for AV and VT (e.g., approximately 4 Hz for repetitive-pulse stimuli) but significantly higher for AT (approximately 10 Hz). Apart from its higher temporal resolution, however, AT synchrony perception resembled AV synchrony perception in that participants could select matching features through attention, and a change in the matching-feature attribute had little effect on temporal resolution. The AT superiority in temporal resolution was indicated not only by synchrony-asynchrony discrimination but also by simultaneity judgments. Temporal order judgments were less affected by modality combination than the other two tasks.
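Thresholds like these are commonly estimated by fitting a psychometric function to discrimination performance. Below is a hedged sketch of that step with fabricated percent-correct values (not the paper's data), assuming a cumulative-Gaussian fall-off from perfect performance to chance:

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

# Fabricated percent-correct data for synchrony-asynchrony
# discrimination of repetitive-pulse stimuli (illustrative only).
freq_hz = np.array([1.0, 2.0, 4.0, 6.0, 8.0, 10.0, 12.0])
p_correct = np.array([0.98, 0.95, 0.80, 0.65, 0.58, 0.52, 0.50])

def psychometric(f, f75, width):
    """Performance falls from 1.0 to chance (0.5) as temporal frequency
    increases; f75 is the frequency yielding 75% correct."""
    return 0.5 + 0.5 * norm.sf(f, loc=f75, scale=width)

params, _ = curve_fit(psychometric, freq_hz, p_correct, p0=[5.0, 2.0])
print(f"Estimated temporal-resolution limit: {params[0]:.1f} Hz")
```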
Temporal synchrony is a critical condition for integrating information presented in different sensory modalities. To gain insight into the mechanism underlying synchrony perception of audio-visual signals, we examined the temporal limits within which human participants can detect audio-visual synchrony. Specifically, we measured percent correct in synchrony-asynchrony discrimination as a function of audio-visual lag while varying the temporal frequency and/or modulation waveform. Audio-visual stimuli were a luminance-modulated Gaussian blob and amplitude-modulated white noise. The results indicated that synchrony-asynchrony discrimination became nearly impossible for periodic pulse trains at temporal frequencies above 4 Hz, even when the lag was large enough for discrimination with single pulses (Experiment 1). This temporal limitation cannot be ascribed to peripheral low-pass filters in either vision or audition (Experiment 2), which suggests that the limit reflects a property of a more central mechanism located at or before cross-modal signal comparison. We also found that the functional behaviour of this central mechanism could not be approximated by a linear low-pass filter (Experiment 3). These results are consistent with the hypothesis that the perception of audio-visual synchrony is based on a comparison of salient temporal features individuated from within-modality signal streams.
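The feature-matching hypothesis in the final sentence can be made concrete with a toy model: individuate salient temporal features (here, pulse peaks) in each stream and compare their times. This is only an illustration under assumed parameters (simple peak detection, a 50 ms matching window), not the authors' implementation:

```python
import numpy as np
from scipy.signal import find_peaks

def feature_times(signal, fs):
    """Individuate salient temporal features as local peaks (seconds)."""
    peaks, _ = find_peaks(signal, height=0.5)
    return peaks / fs

def judged_synchronous(audio, video, fs, window_s=0.05):
    """Match each visual feature to the nearest auditory feature and call
    the streams synchronous if the median offset is within the window."""
    ta, tv = feature_times(audio, fs), feature_times(video, fs)
    if len(ta) == 0 or len(tv) == 0:
        return False
    offsets = [np.min(np.abs(ta - t)) for t in tv]
    return np.median(offsets) < window_s

# At high pulse rates every visual feature has some nearby auditory
# feature whatever the lag, so asynchrony becomes undetectable --
# qualitatively consistent with the limit for periodic pulse trains.
fs, dur = 1000, 2.0
t = np.arange(0, dur, 1 / fs)
for rate in (2, 8):
    video = (np.sin(2 * np.pi * rate * t) > 0.99).astype(float)
    audio = np.roll(video, int(0.1 * fs))   # 100 ms audio lag
    print(f"{rate} Hz, 100 ms lag -> judged synchronous?",
          judged_synchronous(audio, video, fs))
```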
Most research on the multimodal perception of material properties has investigated pairs of modalities, such as vision-touch, vision-audition, audition-touch, and vision-action. Here, we investigated whether the same affective classifications of materials arise in three different modalities, vision, audition, and touch, using wood as the target object. Fifty participants took part in an experiment in which each of the three modalities was tested in isolation. Twenty-two different wood types, including genuine, processed, and fake woods, were perceptually evaluated using a questionnaire of twenty-three items (12 perceptual and 11 affective). The results demonstrated that evaluations of the affective properties of wood were similar in all three modalities. The items "expensiveness, sturdiness, rareness, interestingness, and sophisticatedness" and "pleasantness, relaxed feelings, and liked-disliked" formed separate groups for all three senses. Our results suggest that the affective material properties of wood are at least partly represented in a supramodal fashion. They also suggest an association between perceptual and affective properties, which will be useful not only in basic science but also in applied fields.
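Groupings of questionnaire items like these are typically obtained by factor analysis of the ratings matrix. The following is a minimal sketch with fabricated ratings, assuming a participant-averaged wood-by-item matrix; it illustrates the analysis step only, not the paper's data or pipeline:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
# Fabricated ratings: 22 wood types x 23 questionnaire items,
# averaged over participants (illustrative only).
ratings = rng.normal(size=(22, 23))

fa = FactorAnalysis(n_components=2, random_state=0).fit(ratings)

# Items loading strongly on the same factor form one group, e.g.
# 'expensiveness/sturdiness/...' vs 'pleasantness/relaxedness/...'.
loadings = fa.components_.T               # 23 items x 2 factors
groups = np.abs(loadings).argmax(axis=1)  # crude assignment by peak loading
print(groups)
```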
We examined whether the detection of audio-visual temporal synchrony is determined by a pre-attentive parallel process or by an attentive serial process, using a visual search paradigm. We found that detection of a visual target that changed in synchrony with an auditory stimulus was progressively impaired as the number of unsynchronized visual distractors increased (experiment 1), whereas synchrony discrimination of an attended target in a pre-cued location was unaffected by the presence of distractors (experiment 2). The effect of distractors cannot be ascribed to reduced target visibility, nor can the increase in false alarm rates be predicted by a noisy parallel-processing model. Reaction times for target detection increased linearly with the number of distractors, with the slope about twice as steep for target-absent trials as for target-present trials (experiment 3). Similar results were obtained regardless of whether the audio-visual stimulus consisted of visual flashes synchronized with amplitude-modulated pips or of visual rotations synchronized with frequency-modulated up-down sweeps. All of the results indicate that audio-visual perceptual synchrony is judged by a serial process, and they are consistent with the suggestion that audio-visual temporal synchrony is detected by a 'mid-level' feature-matching process.
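The roughly 2:1 slope ratio is the classic signature of serial self-terminating search: on average half the items are inspected before a present target is found, but all of them must be checked when it is absent. A minimal sketch of the slope analysis with made-up reaction times (not the paper's data):

```python
import numpy as np

# Hypothetical mean reaction times (ms) at each display set size.
set_size = np.array([2, 4, 8, 16])
rt_present = np.array([620, 700, 860, 1180])   # target-present trials
rt_absent = np.array([660, 820, 1140, 1780])   # target-absent trials

slope_p = np.polyfit(set_size, rt_present, 1)[0]
slope_a = np.polyfit(set_size, rt_absent, 1)[0]

print(f"present slope: {slope_p:.0f} ms/item")
print(f"absent slope:  {slope_a:.0f} ms/item")
print(f"absent/present ratio: {slope_a / slope_p:.1f}"
      "  (~2 suggests serial self-terminating search)")
```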