2020 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)
DOI: 10.1109/vr46266.2020.00026
A Comparison of Visual Attention Guiding Approaches for 360° Image-Based VR Tours

Cited by 29 publications (17 citation statements)
References 0 publications

“…In particular, Haptic Feedback and Temporal Luminance Modulation are generally considered to be AG methods that are not distracting for the user while navigating in a VE [4,10,12,16]. Further, they can be better applied together with redirection techniques, as introduced by Razzaque [11].…”
[Table from the citing paper listing the compared AG methods: Luminance Modulation [4], PulseLight and PulseVibe [16], VRHapticDrones [7], FacePush [1], GazeRecall [12], Visual Stimuli [2], Diegetic Cues [13], Scaptics [10], Directing vs. Attracting [14], Arrow, Butterfly, Radar [18], Attention Funnel [17], Rubber Band [15], This work]
Section: Related Work (citation type: mentioning)
confidence: 99%

“…Following the current state-of-the-art [2,14,18,19,22], we measure the search time between step 3 and 5 for the data collection. Using the search time for all 10 target objects, we then calculate the mean and the median values.…”
Section: Task (citation type: mentioning)
confidence: 99%
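
The aggregation described in this statement (per-target search time from step 3 to step 5, then mean and median over all 10 targets) is simple to reproduce. The following is an illustrative Python sketch, not the cited authors' code; the timestamps and the interpretation of steps 3 and 5 are hypothetical.

```python
# Illustrative sketch (not from the cited paper): mean and median search time.
# Search time per target = time between step 3 (cue shown) and step 5 (target found);
# the (t_step3, t_step5) pairs below are hypothetical values in seconds.
from statistics import mean, median

trials = [
    (12.0, 16.2), (30.5, 33.6), (51.0, 56.6), (70.3, 73.1), (90.0, 94.9),
    (110.4, 114.1), (131.0, 137.1), (150.2, 152.7), (170.0, 174.0), (190.5, 193.8),
]  # one (start, end) pair per target object, 10 targets in total

search_times = [t5 - t3 for t3, t5 in trials]
print(f"mean search time:   {mean(search_times):.2f} s")
print(f"median search time: {median(search_times):.2f} s")
```

Reporting the median alongside the mean guards against a few unusually long searches dominating the aggregate.
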
“…It can therefore be assumed that egocentric target cueing leads to less effort in visual-spatial information processing than exocentric cueing and thus enables a faster and more precise localization of off-screen targets [7,43,52]. However, other works have shown the opposite [44,53] or could not prove differences between arrows and maps as was recently shown for visual attention guidance in virtual reality (VR) environments [57]. Thus, there is no evidence that exclusively an egocentric perspective leads to efficient localization of off-screen targets.…”
Section: Introduction (citation type: mentioning)
confidence: 99%