Recent research on bodily self-consciousness has assumed that it consists of three distinct components: the experience of owning a body (body ownership); the experience of being a body with a given location within the environment (self-location); and the experience of taking a first-person, body-centered, perspective on that environment (perspective). Here we review recent neuroimaging studies suggesting that at least two of these components, body ownership and self-location, are implemented in rather distinct neural substrates, located, respectively, in the premotor cortex and in the temporo-parietal junction. We examine these results and consider them in relation to clinical evidence from patients with altered body perception and work on a variety of multisensory, body-related illusions, such as the rubber hand illusion, the full body illusion, the body swap illusion and the enfacement illusion. We conclude by providing a preliminary synthesis of the data on bodily self-consciousness and its neural correlates.
Mental body-representations are highly plastic and can be modified after brief exposure to unexpected sensory feedback. While the role of vision, touch and proprioception in shaping body-representations has been highlighted by many studies, auditory influences on mental body-representations remain poorly understood. Recent work has shown that body-representations can be changed by manipulating the natural sounds produced when one's body impacts on surfaces. But will these changes also occur with non-naturalistic sounds, which provide no information about the impact produced by or on the body? Drawing on the well-documented capacity of dynamic changes in pitch to elicit impressions of motion along the vertical plane and of changes in object size, we asked participants to pull on their right index fingertip with their left hand while they were presented with brief sounds of rising, falling or constant pitch, in the absence of visual information about their hands. Results show an "auditory Pinocchio" effect: participants felt and estimated their finger to be longer after the rising-pitch condition. These results provide the first evidence that sounds that are not indicative of veridical movement, such as non-naturalistic sounds, can induce a Pinocchio-like change in body-representation when arbitrarily paired with a bodily action.
Perception of an object's affordances is enhanced not only when the object is located in one's own peripersonal space, as compared to when it is located within extrapersonal space, but also when the object is located in another person's peripersonal space (as measured by a Spatial Alignment Effect (SAE)). It has been suggested that this reflects the existence of an Interpersonal Body Representation (IBR) that allows us to represent the perceptual states and action possibilities of others. Here, we address the question of whether IBR can be modulated by higher-level/reflective social cognition, such as judgments about one's own social status. Participants responded with either the right or the left hand as soon as a go signal appeared. The go-signal screen contained a task-irrelevant stimulus consisting of a 3D scene in which a mug with a left-facing or right-facing handle was positioned on a table. The mug was positioned either inside or outside the reaching space of the participants. In a third of the trials, the mug was positioned within the reaching space of an avatar seated at the table. Prior to this task, we induced an experience of social ostracism in half of the participants by means of a standardized social exclusion condition. The results showed that the SAE that normally occurs when the mug is in the avatar's reaching space was extinguished by the induced social exclusion. This indicates that judgments about one's own social status modulate the effect of IBR.
The Bayesian model of multisensory cue integration proposed by Ernst and Banks [2002] provides an attractive framework for understanding how our sensory systems may interact. Moreover, it has been suggested that the process of multisensory integration that it models underpins conscious experiences with multisensory representational contents merged across modalities (de Vignemont [2014b]). Should we therefore take empirical support for the Bayesian model as evidence of the multimodality of perception? Focusing on evidence of integration across vision and touch, I argue that apparent support for the model does not warrant the rejection of the view that each of our conscious perceptual experiences is associated with one and only one sense modality.
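The core prediction of the Ernst and Banks model is that visual and haptic estimates are combined by weighting each cue in proportion to its reliability (inverse variance), yielding a fused estimate with lower variance than either cue alone. A minimal numerical sketch of that maximum-likelihood combination rule (the stimulus values here are hypothetical, chosen only for illustration):

```python
def integrate_cues(est_v, var_v, est_t, var_t):
    """Maximum-likelihood combination of a visual and a haptic (touch)
    estimate, as in the Ernst & Banks (2002) cue-integration model.
    Each cue is weighted by its inverse variance (its reliability)."""
    w_v = (1 / var_v) / (1 / var_v + 1 / var_t)
    w_t = 1 - w_v
    combined = w_v * est_v + w_t * est_t
    # The fused estimate is more reliable than either cue on its own.
    combined_var = 1 / (1 / var_v + 1 / var_t)
    return combined, combined_var

# Hypothetical example: vision estimates a bar at 55 mm (variance 4),
# touch estimates 60 mm (variance 16).
size, var = integrate_cues(55.0, 4.0, 60.0, 16.0)
# Vision dominates because it is the more reliable cue here:
# size == 56.0 mm, var == 3.2 (lower than both 4 and 16).
```

The model's signature empirical prediction, exploited in the original study, is this variance reduction: the combined estimate is closer to the more reliable cue, and its variance is smaller than that of either unimodal estimate.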