2022
DOI: 10.1111/tgis.12914
A gaze‐based interaction method for large‐scale and large‐space disaster scenes within mobile virtual reality

Abstract: A three‐dimensional (3D) visualization of disaster scenes based on mobile virtual reality (VR) can improve the application scenarios and emergency service capabilities of traditional 3D visualization of disaster scenes. Because a smartphone needs to be placed into a mobile head‐mounted display, conventional touch scene interaction cannot be used by mobile VR, and the user's gaze usually serves as the default scene interaction method. However, the existing gaze‐based interaction methods for mobile VR scenes are…

Cited by 9 publications
(2 citation statements)
References 50 publications
“…Eye tracking has been used as an input modality for decades [17,18]. There is increasing interest in gaze interaction, especially in virtual/augmented reality (VR/AR) [19][20][21][22][23], partly due to improvements in tracking accuracy, portability and affordability. One critical issue in gaze interaction is avoiding unintended selection (i.e., a user looking at an object does not mean he or she wants to interact with it), which is referred to as the Midas Touch problem [17].…”
Section: Eye Tracking For Human–Computer Interaction
confidence: 99%
“…People could interact with the virtual space through gaze interaction, gesture interaction, interactive devices, etc. [68]. At the same time, people's movement and proprioception are closely linked, and the perception of self and environment is enhanced, arising from the presence and positive emotions [69].…”
Section: Introduction
confidence: 99%