We investigated postural responses (head displacements) and self-motion perception (vection) in response to radial and lateral optic flow, presented through a head-mounted display, while participants sat or stood. Head displacement directions varied across postures. In the standing posture, radial optic flow generally produced head displacement opposite to the perceived vection direction, consistent with the literature; in the sitting posture, however, it generally produced head displacement in the same direction as the vection. In the standing posture, responses were evident soon after the onset of the optic flow but became less clear in the latter half of a trial. The pattern was similar for lateral flow, though less pronounced. Our findings suggest partially distinct processes underlying vection and postural control.
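For readers unfamiliar with these stimuli, the following minimal Python sketch illustrates how radial (expanding or contracting) and lateral optic flow are commonly generated: dots in a 3D volume are translated relative to the observer and projected to the display. The dot count, volume size, speed, and frame rate are illustrative assumptions, not the study's actual parameters.

```python
# A minimal sketch (not the authors' stimulus code) of radial vs. lateral
# optic flow as a translated 3D dot field. All parameters are assumptions.
import numpy as np

rng = np.random.default_rng(0)

N_DOTS = 500
DEPTH_RANGE = (0.5, 5.0)   # metres in front of the observer (assumed)
SPEED = 1.0                # simulated self-motion speed, m/s (assumed)
DT = 1.0 / 90.0            # frame interval for a ~90 Hz HMD (assumed)

# Dots uniformly distributed in a volume (x, y, z).
dots = np.column_stack([
    rng.uniform(-2, 2, N_DOTS),           # x
    rng.uniform(-2, 2, N_DOTS),           # y
    rng.uniform(*DEPTH_RANGE, N_DOTS),    # z (depth)
])

def step(dots, flow="radial", direction=+1):
    """Advance the dot field one frame.

    flow="radial": simulated forward/backward self-motion; dots move along
    z, producing expansion (direction=+1) or contraction (direction=-1).
    flow="lateral": simulated sideways self-motion; dots move along x.
    """
    if flow == "radial":
        dots[:, 2] -= direction * SPEED * DT
        # Recycle dots that pass the observer or drift out of the volume.
        out = (dots[:, 2] < DEPTH_RANGE[0]) | (dots[:, 2] > DEPTH_RANGE[1])
        dots[out, 2] = rng.uniform(*DEPTH_RANGE, out.sum())
    else:  # lateral
        dots[:, 0] -= direction * SPEED * DT
        out = np.abs(dots[:, 0]) > 2
        dots[out, 0] = -np.sign(dots[out, 0]) * 2  # wrap around
    return dots

def project(dots, focal=1.0):
    """Pinhole projection of 3D dots to 2D screen coordinates."""
    return focal * dots[:, :2] / dots[:, 2:3]

screen_xy = project(step(dots, flow="radial", direction=+1))
```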
Optic flow that simulates self-motion often produces postural adjustment. Although the literature suggests that human postural control relies largely on visual input from the lower part of the environment, the effects of the vertical location of optic flow on postural responses have not been well investigated. Here, we examined whether optic flow presented in the lower visual field produces stronger responses than optic flow in the upper visual field. Expanding or contracting optic flow was presented in the upper, lower, or full visual field through an Oculus Rift head-mounted display, and head displacement and vection strength were measured. Head displacement was larger when contracting optic flow was presented in the full and lower visual fields than in the upper visual field during the early period of presentation. Vection was strongest in the full visual field and weakest in the upper visual field. This lower-field superiority in head displacement and vection supports the notion that ecologically relevant information plays a particularly important role in human postural control and self-motion perception.
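The abstract does not specify the analysis pipeline, but as a hedged illustration, head-displacement measures like these might be computed from HMD position traces along the lines of the sketch below. The sampling rate, baseline choice, and early/late analysis windows are assumptions for illustration only.

```python
# A hedged sketch of quantifying head displacement from an HMD position
# trace, split into early and late trial windows. Sampling rate, window
# lengths, and the measure (mean anterior-posterior shift from baseline)
# are assumptions, not the study's actual analysis.
import numpy as np

FS = 90        # tracking samples per second (assumed)
TRIAL_S = 20   # trial duration in seconds (assumed)

def head_displacement(z_trace, fs=FS, early_s=(0, 10), late_s=(10, 20)):
    """Mean anterior-posterior (z) displacement relative to a one-second
    baseline at trial onset, for an early and a late window."""
    baseline = z_trace[:fs].mean()
    rel = z_trace - baseline
    early = rel[early_s[0] * fs : early_s[1] * fs].mean()
    late = rel[late_s[0] * fs : late_s[1] * fs].mean()
    return early, late

# Synthetic example: a small forward lean that fades in the second half.
t = np.arange(TRIAL_S * FS) / FS
noise = 0.001 * np.random.default_rng(1).standard_normal(t.size)
z = 0.01 * np.exp(-((t - 5) ** 2) / 8) + noise
print(head_displacement(z))
```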
Humans perceive self-motion using multisensory information, with vision playing a dominant role, as exploited in virtual reality (VR) technologies. Previous studies reported that visual motion presented in the lower visual field (LoVF) induces a stronger illusion of self-motion (vection) than visual motion in the upper visual field (UVF). However, it remained unknown whether this LoVF superiority is tied to the retinotopic frame of reference or to the environmental frame. Here, we investigated the influences of retinotopic and environmental frames on the LoVF superiority of vection. We presented a planar surface extending along the depth axis in one of four visual fields (upper, lower, right, or left), with the texture on the surface moving forward or backward. Participants reported vection while observing the stimulus through a VR head-mounted display (HMD), either sitting or lying in a lateral recumbent position. The visual motion induced stronger vection when presented in the LoVF than in the UVF in both postures. Notably, LoVF vection ratings were stronger in the sitting posture than in the recumbent posture. Moreover, recumbent participants reported stronger vection when the stimulus was presented in the gravitationally lower field than in the gravitationally upper field. These results demonstrate that multiple spatial frames contribute to self-motion perception and point to the importance of the ground surface.
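The dissociation between the two frames can be made concrete with a small sketch: under a lateral recumbent posture, the retinotopically lower field no longer coincides with the gravitationally lower field. The posture labels and mapping below are an illustrative assumption about the geometry, not code from the study.

```python
# Illustrative mapping (assumed geometry, not the study's code): which
# retinotopic field is gravitationally lowest in each posture. Lying on
# the right ear rotates the head ~90 degrees about the line of sight, so
# the retinotopic right field points toward the ground.
GRAVITATIONALLY_LOWER = {
    "sitting": "lower",                    # frames aligned
    "recumbent_right_side_down": "right",
    "recumbent_left_side_down": "left",
}

def frames_dissociated(posture: str) -> bool:
    """True when the retinotopic and environmental frames disagree, i.e.,
    the gravitationally lower field is not the retinotopic lower field."""
    return GRAVITATIONALLY_LOWER[posture] != "lower"

for posture, field in GRAVITATIONALLY_LOWER.items():
    print(posture, field, frames_dissociated(posture))
```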