Incorporating the fact that the senses are embodied is necessary for an organism to interpret sensory information. Before a unified perception of the world can be formed, sensory signals must be processed with reference to a body representation. The various attributes of the body, such as shape, proportion, posture, and movement, can both be derived from the various sensory systems and affect perception of the world (including the body itself). In this review we examine the relationships between sensory and motor information, body representations, and perceptions of the world and the body. We provide several examples of how the body affects perception (including, but not limited to, body perception). First, we show that body orientation affects visual distance perception and object orientation. Also, visual–auditory crossmodal correspondences depend on the orientation of the body: auditory “high” frequencies correspond to a visual “up” defined by both gravity and body coordinates. Next, we show that the perceived location of touch is affected by the orientation of the head and eyes on the body, suggesting a visual component to coding body locations. Additionally, the reference frame used for coding touch location seems to depend on whether gaze is static or moves relative to the body during the tactile task. Perceived attributes of the body, such as body size, affect tactile perception even at the level of detection thresholds and two-point discrimination. Next, long-range tactile masking provides clues to the posture of the body in a canonical body schema. Finally, ownership of seen body parts depends on the orientation and perspective of the body part in view. Together, these findings demonstrate how sensory and motor information, body representations, and perceptions (of the body and the world) are interdependent.
The position of gaze (eye plus head position) relative to the body is known to alter the perceived locations of sensory targets. This effect suggests that perceptual space is at least partially coded in a gaze-centered reference frame. However, the direction of the effects reported has not been consistent. Here, we investigate the cause of a discrepancy between reported directions of shift in tactile localization related to head position. We demonstrate that head eccentricity can cause errors in touch localization either in the same direction as the head turn or in the opposite direction, depending on the procedure used. When head position is held eccentric during both the presentation of a touch and the response, the shift is in the direction opposite to the head turn. When the head is returned to center before reporting, the shift is in the same direction as the head eccentricity. We rule out a number of possible explanations for the difference and conclude that when the head is moved between the touch and the response, the touch is coded in a predominantly gaze-centered reference frame, whereas when the head remains stationary a predominantly body-centered reference frame is used. The mechanism underlying these displacements in perceived location is proposed to involve an underestimated gaze signal. We propose a model demonstrating how this single neural error could cause localization errors in either direction, depending on whether gaze or the body midline is used as a reference. This model may be useful in explaining gaze-related localization errors in other modalities.
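To make the idea of a single gaze-gain error concrete, here is a minimal numerical sketch. It is our illustrative formalization, not the authors' published implementation, and it rests on three labeled assumptions: the static gaze-eccentricity signal is scaled by a gain k < 1; updating of a gaze-centered code across the intervening head movement is veridical; and in the stationary condition the felt body midline is pulled toward gaze by the unregistered eccentricity. Under these assumptions, one gain error yields equal-sized shifts of opposite sign in the two procedures.

```python
# Illustrative sketch (our assumptions, not the authors' code) of how a
# single underestimated gaze signal can shift perceived touch location
# in either direction, depending on the reference frame used.

K = 0.7       # assumed gain on the static gaze signal (k < 1 = underestimation)
GAZE = 20.0   # head/gaze eccentricity during the touch (deg, rightward positive)
TOUCH = 0.0   # true touch location relative to the body midline (deg)

def head_stays_eccentric(touch=TOUCH, gaze=GAZE, k=K):
    """Body-centered coding: the touch is stored relative to the body
    midline, but the felt midline is assumed to be dragged toward gaze
    by the unregistered eccentricity (1 - k) * gaze."""
    felt_midline = (1.0 - k) * gaze
    return touch - felt_midline          # shift OPPOSITE to the head turn

def head_returns_to_center(touch=TOUCH, gaze=GAZE, k=K):
    """Gaze-centered coding: the touch is recoded relative to gaze using
    the underestimated signal, then updated veridically when the head
    moves back to center."""
    gaze_code = touch - k * gaze         # encoding uses the underestimated gaze
    gaze_code += gaze                    # veridical remapping across the movement
    return gaze_code                     # gaze now centered => body coordinates

print(head_stays_eccentric())    # -6.0 deg (opposite the 20 deg head turn)
print(head_returns_to_center())  # +6.0 deg (with the head turn)
```

Both errors have magnitude (1 - k) * gaze; only the reference used at readout determines the sign, which matches the two directions reported above.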
Low- and high-pitched sounds are perceptually associated with low and high visuospatial elevations, respectively. The spatial properties of this association are not well understood. Here we report two experiments that investigated whether low and high tones can be used as spatial cues to upright for self-orientation, and that identified the spatial frame(s) of reference used in perceptually binding auditory pitch to visuospatial ‘up’ and ‘down’. In experiment 1, participants’ perceptual upright (PU) was measured while they lay on their right side, with and without high- and low-pitched sounds played through speakers above their left ear and below their right ear. The sounds were ineffective in moving the perceived upright from a direction intermediate between the body and gravity towards the direction indicated by the sounds. In experiment 2, we measured the biasing effects of ascending and descending tones played through headphones on ambiguous vertical or horizontal visual motion, created by combining gratings drifting in opposite directions, while participants either sat upright or lay on their right side. Ascending and descending tones biased the interpretation of the ambiguous motion along both the gravitational vertical and the long axis of the body, with the strongest effect along the body axis. The combination of these two effects showed that the axis of maximum effect of sound corresponded approximately to the direction of the perceptual upright, compatible with the idea that ‘high’ and ‘low’ sounds are defined along this axis.
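The ambiguous stimuli in experiment 2 rely on a standard identity: two otherwise identical gratings drifting in opposite directions sum to a counterphase (standing) grating, sin(kx − ωt) + sin(kx + ωt) = 2 sin(kx) cos(ωt), which contains no net motion signal and can therefore be perceptually captured by the tones. The snippet below simply verifies the identity numerically; the frequency values are arbitrary.

```python
import numpy as np

# Two opposite-drifting gratings sum to a counterphase (standing) grating:
# sin(k*x - w*t) + sin(k*x + w*t) == 2*sin(k*x)*cos(w*t).
# The sum has no net drift, so its motion direction is ambiguous.

k, w = 2 * np.pi / 5.0, 2 * np.pi * 1.0   # arbitrary spatial/temporal frequencies
x = np.linspace(0, 10, 200)               # position (deg)
t = np.linspace(0, 2, 120)                # time (s)
X, T = np.meshgrid(x, t)

drift_one_way = np.sin(k * X - w * T)     # grating drifting in one direction
drift_other_way = np.sin(k * X + w * T)   # identical grating drifting oppositely
standing = 2 * np.sin(k * X) * np.cos(w * T)

print(np.allclose(drift_one_way + drift_other_way, standing))  # True
```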
The perceived direction of up depends on visual, gravity, and body cues, each of which is given a weighting by the brain (Dyde et al., 2006). Little work has been done, however, to determine whether sound might also be usable by the brain as a cue to up. Here we assess the possible contribution of sound to perceived orientation by adding a sound cue to gravity. The perceptual upright (PU), the direction in which a character is most easily recognized, was assessed using the Oriented Character Recognition Test (OCHART). Subjects identified a character presented at various orientations (0–360 degrees of rotation) as either a ‘p’ or a ‘d’. The orientations were chosen by a QUEST adaptive staircase procedure, and the mean of the points of subjective equality was taken as the perceptual upright. Subjects lay on their side and viewed a laptop screen through a shroud; body and gravity cues were thus orthogonal, so swings of the perceptual upright towards or away from gravity could be measured. Loudspeakers were mounted above and below the lying subject (opposite the left and right ears), and sounds were presented in synchrony with the appearance of the character on the screen. Changes in the direction of the PU were recorded depending on whether a sound was present or not. We conclude that sounds can contribute to the perception of upright.
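As a rough sketch of how the perceptual upright falls out of such data (our illustrative reconstruction, not the authors' analysis code): a rotated ‘p’ reads as a ‘d’ near the opposite orientation, so the proportion of ‘p’ responses crosses 50% at two transition orientations roughly 180 degrees apart. Fitting a psychometric function at each transition yields two points of subjective equality, whose mean is taken as the PU. The response data below are hypothetical.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(theta, pse, width):
    """Rising psychometric function of character orientation (deg)."""
    return 1.0 / (1.0 + np.exp(-(theta - pse) / width))

def fit_pse(orientations, proportions):
    """Fit one transition and return its 50% crossing point (the PSE)."""
    (pse, _), _ = curve_fit(logistic, orientations, proportions,
                            p0=[np.mean(orientations), 10.0])
    return pse

# Hypothetical response proportions near the two 'p'/'d' transitions.
cw_orient = np.array([60.0, 75.0, 90.0, 105.0, 120.0])
cw_p = np.array([0.95, 0.80, 0.45, 0.15, 0.05])       # 'p' fades out near +90 deg
ccw_orient = np.array([-120.0, -105.0, -90.0, -75.0, -60.0])
ccw_p = np.array([0.05, 0.20, 0.55, 0.85, 0.95])      # 'p' fades in near -90 deg

pse_cw = fit_pse(cw_orient, 1.0 - cw_p)   # fit the rising proportion of 'd'
pse_ccw = fit_pse(ccw_orient, ccw_p)      # fit the rising proportion of 'p'

# The PU lies midway between the two PSEs; a full analysis would use
# circular statistics, but these toy angles are already unwrapped.
pu = (pse_cw + pse_ccw) / 2.0
print(pse_cw, pse_ccw, pu)                # roughly +90, -90, and 0 deg
```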