2019
DOI: 10.1049/iet-ipr.2018.5920
Survey on depth perception in head mounted displays: distance estimation in virtual reality, augmented reality, and mixed reality

Cited by 101 publications (48 citation statements) · References 44 publications
“…2), the data from the scenarios with diffuse light conditions (1-Real-Diffuse, 2-AR-Smooth-Diffuse, 3-AR-Poly-Diffuse) show farther depth measures, about 10 cm of difference regarding spot light scenarios (z:0.20 for blue points vs 0.10 for orange ones). Such results are coherent with the literature, since it is mentioned that observers tend to overestimate distance in the personal space [9]. About the texture, it seems that the polystyrene one has a positive effect when combined to the spot lights, since scenario 6 has the more precise measures here.…”
Section: Discussion (supporting)
confidence: 90%
“…This could be a consequence of the small field of view in the current devices. Moreover, in regards of depth perception in mixed reality, observers tend to underestimate objects' distance in the action and distant space, while they overestimate it for objects in the personal space [9]. In the absence of visual aids, humans tend to estimate distance to objects with an egocentric distance of 0.9 ± 0.2m [12].…”
Section: Related Work (mentioning)
confidence: 99%
“…It is an environment that mixes the aspects of both technologies, unifying the experience to require a single device. Thereby, MR merges both concepts to allow the user to interact with real objects within a virtual world, to be totally immersed in a completely virtual world or to reproduce virtual elements in the real environment [67,68]. Although MR has appeared relatively recently, certain companies with technological value are supporting its development, with the aim of democratising its use [69].…”
Section: Theoretical and Conceptual Framework (mentioning)
confidence: 99%
“…Headsets and other stereo-enabled displays affect depth perception [49,52], as people tend to underestimate depths in VR and AR. A survey on positioning in mixed reality [53] listed the types of tasks used to analyze depth perception in VR and AR: depth estimate verbal reporting, eye-body coordination tests such as walking or remote object movement via joysticks, and object-depth interaction such as picking, placing, and throwing. The survey also noted the need for research on whether depth perception allows for better localization in real-world scenes shown through AR and VR headsets.…”
Section: Depth Perception In Virtual Augmented Environments (mentioning)
confidence: 99%
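The excerpt above describes under- and overestimation of depth in headsets. A common way such studies summarize verbal-report data is the ratio of judged to actual egocentric distance (values below 1 indicate underestimation). The following is a minimal illustrative sketch, not code from the survey; the function name and the sample numbers are assumptions chosen for illustration.

```python
def estimation_ratio(judged_m, actual_m):
    """Mean ratio of judged to actual egocentric distance (in meters).

    A ratio < 1 means observers underestimated distance on average;
    a ratio > 1 means they overestimated it.
    """
    if len(judged_m) != len(actual_m) or not judged_m:
        raise ValueError("paired, non-empty samples required")
    return sum(j / a for j, a in zip(judged_m, actual_m)) / len(judged_m)

# Hypothetical verbal reports for targets at 3 m, 5 m, and 7 m in a headset.
judged = [2.4, 4.1, 5.6]
actual = [3.0, 5.0, 7.0]
print(f"judged/actual ratio: {estimation_ratio(judged, actual):.2f}")
```

With these illustrative numbers the ratio is below 1, matching the underestimation pattern the excerpt reports for action and distant space.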
“…In each test run, the participants were asked to navigate and accurately touch highlighted targets in the test area, using the remote control excavator's arm. This is a combination of the locomotion and pick-and-place tasks commonly used for depth estimation and positioning assessment [53,[69][70][71]76], adapted to the context of remotely controlling industrial machinery. The targets were highlighted in random order, and participants were scored on the number of targets reached.…”
Section: Experiments Design (mentioning)
confidence: 99%