With the broader use of stereoscopic displays, a flurry of research on the accommodation-vergence conflict has emerged, highlighting its implications for the human visual system. In stereoscopic displays, the introduction of binocular disparities requires the eyes to make vergence movements. In this study, we examined vergence dynamics with regard to the conflict between the stimulus to accommodation and the stimulus to vergence. In a first experiment, we evaluated the immediate effect of the conflict on vergence responses, either by presenting stimuli with conflicting disparity and focus on a stereoscopic display (i.e., increasing the stereoscopic demand) or by presenting stimuli with matched disparity and focus using an arrangement of displays and a beam splitter (i.e., focus and disparity specifying the same locations). We found that vergence responses were slower overall in the first case, owing to the conflict between accommodation and vergence. In a second experiment, we examined the effect of prolonged exposure to the accommodation-vergence conflict on vergence responses: participants judged whether an oscillating depth pattern was in front of or behind the fixation plane. An increase in peak velocity was observed, suggesting that the vergence system had adapted to the stereoscopic demand. A slight increase in vergence latency was also observed, indicating a small decline in vergence performance. These findings document how the vergence system behaves in stereoscopic displays and offer a better understanding of its dynamics. We describe which stimuli in stereoscopic movies might produce these oculomotor effects and discuss potential applications.
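To make the conflict described above concrete, the sketch below contrasts the vergence demand specified by disparity with the fixed accommodative distance of the screen. It is a minimal illustration, not material from the study: the interpupillary distance and viewing distances are assumed values chosen only for demonstration.

```python
# Minimal sketch (illustrative assumptions, not the study's stimuli):
# vergence angle required by disparity vs. the fixed screen distance
# that drives accommodation.
import math

def vergence_angle_deg(distance_m, ipd_m=0.063):
    """Vergence angle (degrees) needed to binocularly fixate a point at the
    given distance, for an assumed interpupillary distance of 63 mm."""
    return math.degrees(2 * math.atan(ipd_m / (2 * distance_m)))

screen_distance = 1.0      # accommodation stays at the physical screen (m)
simulated_distance = 0.5   # disparity places the virtual target here (m)

print(f"Vergence demand at the screen:  {vergence_angle_deg(screen_distance):.2f} deg")
print(f"Vergence demand at the target:  {vergence_angle_deg(simulated_distance):.2f} deg")
# The eyes must converge to the simulated distance while focus remains at the
# screen; the gap between these two demands is the accommodation-vergence conflict.
```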
Viewing a scene on a screen display differs greatly from viewing it in the real world: the visual information is conveyed via a flat screen at a fixed distance, and in conventional stereoscopic displays this screen distance can influence how viewers perceive depth in stereograms. This study investigated whether screen distance also influences perceived depth in Virtual Reality (VR) systems, which provide additional motion parallax information. Participants adjusted the depth of a vertical dihedron displayed as a random-dot stereogram. In a first experiment, the stimulus was presented either alone against a gray, untextured background or within a cue-rich environment. We found that, despite the extra motion parallax information available in VR systems compared with conventional stereoscopic displays, physical screen distance still substantially affected depth perception at longer simulated distances. The effect lessened, however, when observers were immersed in a rich, structured environment, possibly because it allowed them to use other depth cues. A second experiment assessed the influence of several display-related factors (resolution, display orientation, luminance non-uniformity, and specular reflection), as well as the size of the accommodation-vergence (A-V) conflict. Depth perception was compared between a Head-Mounted Display (HMD) and an L-shaped system, and between a CAVE and an L-shaped system. These comparisons across CAVE-like VR systems and HMDs revealed that the A-V conflict and the inclusion of a rich environment were the major factors affecting depth perception. These results have practical and methodological implications for the reliable use of VR systems, especially where accurate depth matching is required.
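The role of A-V conflict size mentioned above can be illustrated with a short worked example. The sketch below expresses the conflict in diopters as the difference between the accommodative demand of the screen (or HMD optics) and the vergence demand of the simulated point; the specific distances are hypothetical and do not come from the paper.

```python
# Minimal sketch (assumed distances, not the paper's conditions): A-V conflict
# expressed in diopters as |1/screen_distance - 1/simulated_distance|.

def av_conflict_diopters(screen_distance_m, simulated_distance_m):
    """Absolute accommodation-vergence conflict in diopters (1/m)."""
    return abs(1.0 / screen_distance_m - 1.0 / simulated_distance_m)

simulated = 2.0  # virtual object simulated 2 m away
for label, screen in [("HMD-like optics, focal distance ~1.5 m", 1.5),
                      ("CAVE-like wall, screen at ~3 m", 3.0)]:
    print(f"{label}: conflict = {av_conflict_diopters(screen, simulated):.2f} D")
# For the same simulated depth, different physical screen (or focal) distances
# yield different conflict sizes, which is one reason screen distance can still
# influence perceived depth in VR systems.
```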