Human-environment interactions normally occur in the physical milieu and thus through the medium of the body, within the space immediately surrounding it: the peripersonal space (PPS). However, human interactions increasingly occur with or within virtual environments, and hence novel approaches and metrics must be developed to index human-environment interactions in virtual reality (VR). Here, we present a multisensory task that measures the spatial extent of human PPS in real, virtual, and augmented realities. We validated it in a mixed reality (MR) ecosystem in which the real environment and virtual objects are blended together in order to administer and control visual, auditory, and tactile stimuli under ecologically valid conditions. Within this mixed-reality environment, participants are asked to respond as fast as possible to tactile stimuli on their body while task-irrelevant visual or audiovisual stimuli approach it. Results demonstrate that, consistent with observations from monkey electrophysiology and from studies in real environments, tactile detection is enhanced when visual or auditory stimuli are close to the body, but not when they are far from it. We then take the location at which this multisensory facilitation occurs as a proxy for the boundary of PPS. We observe that mapping PPS with audiovisual, as opposed to visual-only, looming stimuli yields sigmoidal fits (allowing the bifurcation between near and far space) with greater goodness of fit. In sum, our approach captures the boundaries of PPS on a spatial continuum, at the individual-subject level, and within a fully controlled and previously laboratory-validated setup, while maintaining the richness and ecological validity of real-life events. The task can therefore be applied to study the properties of PPS in humans and to index the features governing human-environment interactions in virtual or mixed reality.
We propose PPS as an ecologically valid and neurophysiologically established metric in the study of the impact of VR and related technologies on society and individuals.
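The boundary-estimation step described above (a sigmoidal fit to tactile reaction times as a function of looming-stimulus distance, with the transition point taken as the PPS boundary) can be sketched as follows. This is a minimal illustration, not the authors' analysis pipeline: the distances, reaction times, and parameterization are hypothetical stand-ins for the kind of data the task produces.

```python
# Sketch: estimating a PPS boundary by fitting a sigmoid to mean tactile
# reaction times (RTs) recorded at several stimulus-to-body distances.
# All values below are illustrative, not data from the study.
import numpy as np
from scipy.optimize import curve_fit

def sigmoid(d, rt_near, rt_far, d_c, slope):
    """RT as a function of distance d: fast near the body, slow far away.
    d_c, the midpoint of the transition, is taken as the PPS boundary."""
    return rt_far + (rt_near - rt_far) / (1.0 + np.exp(slope * (d - d_c)))

distances = np.array([10, 30, 50, 70, 90, 110], dtype=float)  # cm
rts = np.array([352, 355, 371, 396, 401, 403], dtype=float)   # ms

# Initial guesses: near-space RT, far-space RT, boundary (cm), slope
p0 = [350.0, 400.0, 60.0, 0.1]
params, _ = curve_fit(sigmoid, distances, rts, p0=p0, maxfev=10000)
boundary = params[2]
print(f"Estimated PPS boundary: {boundary:.1f} cm")
```

A steeper fitted slope indicates a sharper near/far transition; a poor sigmoidal fit (as reported for visual-only looming stimuli) would leave the boundary estimate unreliable.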
Vision is known to be shaped by context, defined by environmental and bodily signals. In the Taylor illusion, the size of an afterimage projected on one's hand changes according to proprioceptive signals conveying hand position. Here, we assessed whether the Taylor illusion does not just depend on the physical hand position, but also on bodily self-consciousness as quantified through illusory hand ownership. Relying on the somatic rubber hand illusion, we manipulated hand ownership, such that participants embodied a rubber hand placed next to their own hand. We found that an afterimage projected on the participant's hand drifted depending on illusory ownership between the participants' two hands, showing an implication of self-representation during the Taylor illusion. Oscillatory power analysis of electroencephalographic signals showed that illusory hand ownership was stronger in participants with stronger α suppression over left sensorimotor cortex, whereas the Taylor illusion correlated with higher β/γ power over frontotemporal regions. Higher γ connectivity between left sensorimotor and inferior parietal cortex was also found during illusory hand ownership. These data show that afterimage drifts in the Taylor illusion depend not only on the physical hand position but also on subjective ownership, which itself is based on the synchrony of somatosensory signals from the two hands. The effect of ownership on afterimage drifts is associated with β/γ power and γ connectivity between frontoparietal regions and the visual cortex. Together, our results suggest that visual percepts are not only influenced by bodily context but are self-grounded, mapped on a self-referential frame.
Previous evidence highlighted the multisensory‐motor origin of embodiment – that is, the experience of having a body and of being in control of it – and the possibility of experimentally manipulating it. For instance, an illusory feeling of embodiment towards a fake hand can be triggered by providing synchronous visuo‐tactile stimulation to the hand of participants and to a fake hand, or by asking participants to move their hand and observe a fake hand moving accordingly (rubber hand illusion). Here, we tested whether it is possible to manipulate embodiment not through stimulation of the participant's hand, but by directly tapping into the brain's hand representation via non‐invasive brain stimulation. To this aim, we combined transcranial magnetic stimulation (TMS), to activate the hand corticospinal representation, with virtual reality (VR), to provide matching (as contrasted to non‐matching) visual feedback, mimicking involuntary hand movements evoked by TMS. We show that illusory embodiment occurred when TMS pulses were temporally matched with VR feedback, but not when TMS was administered outside primary motor cortex (over the vertex), or when stimulating motor cortex at a lower intensity (that did not activate peripheral muscles). Behavioural (questionnaires) and neurophysiological (motor‐evoked‐potentials, TMS‐evoked‐movements) measures further indicated that embodiment was not explained by stimulation per se, but depended on the temporal coherence between TMS‐induced activation of the hand corticospinal representation and the virtual bodily feedback. This reveals that non‐invasive brain stimulation may replace the application of external tactile hand cues and motor components related to volition, planning and anticipation.
The majority of scientific studies on consciousness have focused on vision, exploring the cognitive and neural mechanisms of conscious access to visual stimuli. In parallel, studies on bodily consciousness have revealed that bodily (i.e. tactile, proprioceptive, visceral, vestibular) signals are the basis for the sense of self. However, the role of bodily signals in the formation of visual consciousness is not well understood. Here we investigated how body-related visuo-tactile stimulation modulates conscious access to visual stimuli. We used a robotic platform to apply controlled tactile stimulation to the participants' back while they viewed a dot moving either in synchrony or asynchrony with the touch on their back. Critically, the dot was rendered invisible through continuous flash suppression. Manipulating the visual context by presenting the dot moving on either a body form or a non-bodily object, we show that: (i) conflict induced by synchronous visuo-tactile stimulation in a body context is associated with delayed conscious access compared to asynchronous visuo-tactile stimulation, (ii) this effect occurs only in the context of a visual body form, and (iii) it is not due to detection or response biases. The results indicate that body-related visuo-tactile conflicts impact visual consciousness by facilitating access of non-conflicting visual information to awareness, and that these conflicts are sensitive to the visual context in which they are presented, highlighting the interplay between bodily signals and visual experience.