Extended Abstracts of the Annual Symposium on Computer-Human Interaction in Play Companion Extended Abstracts 2019
DOI: 10.1145/3341215.3356275
Lost & Found: Gaze-based Player Guidance Feedback in Exploration Games

Cited by 6 publications (2 citation statements)
References 21 publications
“…In the field of visual attention guidance, many different methods have been proposed in recent years for application in images as well as videos; in non‐immersive settings [DMGB04, BMSG09, VMFS11, HKS16], in VR settings using Head‐Mounted Displays (HMDs) [LCH*17, GSEM17, GAM18, RAK18, GTA*19], and even in an immersive room‐scale projection system [GATM18, GTA*19]. Attention guidance as a means of support may be helpful in various application scenarios, such as virtual training or remote teaching [dKJ17, FMS*19, YKB19b, YKB19a], guided exploration [LH19], multi‐monitor surveillance tasks [SKB19], or immersive storytelling [SP19, SRD*19, LSGB20]. For our survey of related work, we distinguish between passive and active methods, that is, whether or not they actively incorporate real‐time gaze-tracking data.…”
Section: Related Work (mentioning)
confidence: 99%
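
The passive/active distinction drawn in the quoted survey hinges on whether real-time gaze data drives the guidance cue. The following minimal Python sketch is purely illustrative: the thresholds, the gaze-sample format, and all function names are assumptions made for this example, not anything taken from the cited works. It shows the skeleton of an active method, in which the cue is suppressed while the target is fixated and guidance ends after a short confirmed dwell.

import math
from typing import Iterable, List, Tuple

Point = Tuple[float, float]

FIXATION_RADIUS = 60.0   # px; assumed tolerance around the target object
CONFIRM_FRAMES = 30      # roughly 0.5 s at 60 Hz before guidance is retired

def active_guidance(gaze_samples: Iterable[Point], target: Point) -> int:
    """Active guidance: the cue is shown only while the player is NOT looking
    at the target, and guidance stops once the target has been fixated long
    enough. Returns the frame index at which fixation was confirmed, or -1."""
    dwell = 0
    for frame, (gx, gy) in enumerate(gaze_samples):
        on_target = math.hypot(gx - target[0], gy - target[1]) <= FIXATION_RADIUS
        dwell = dwell + 1 if on_target else 0
        # A passive method would render its cue here unconditionally;
        # the active variant suppresses it whenever on_target is True.
        if dwell >= CONFIRM_FRAMES:
            return frame
    return -1

if __name__ == "__main__":
    # Synthetic gaze trace: the player wanders, then settles on the target.
    target = (400.0, 300.0)
    trace: List[Point] = [(50.0 + 5 * f, 80.0) for f in range(40)] + [(405.0, 298.0)] * 60
    print("fixation confirmed at frame", active_guidance(trace, target))

A passive variant would simply omit the gaze check and render the cue on a timer regardless of where the player is looking.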
“…Adaptivity provides an opportunity to offer an individual user experience (UX) and to increase flow, presence, and immersion. Related work has shown that behavioral indicators and biometric data for hand [6,9] and foot movements [5,8], as well as gaze behavior [7,11,12], can be used to adapt virtual environments and virtual reality (VR) [2,4]. The use of VR head-mounted displays (HMDs) makes it possible to strengthen these effects by encapsulating users in their own environment.…”
Section: Introduction (mentioning)
confidence: 99%
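
As a purely illustrative companion to the quoted statement, the sketch below maps one behavioral indicator, gaze dispersion over a short window, to an environment parameter such as the strength of a guidance hint. The indicator, the thresholds, and the function names are assumptions chosen for this example, not the adaptation rules used in the cited works.

import statistics
from typing import List, Tuple

Point = Tuple[float, float]

def gaze_dispersion(window: List[Point]) -> float:
    """Spread of recent gaze samples; high values suggest visual search."""
    xs, ys = zip(*window)
    return statistics.pstdev(xs) + statistics.pstdev(ys)

def adapt_hint_strength(window: List[Point],
                        low: float = 20.0, high: float = 120.0) -> float:
    """Map dispersion to a hint strength in [0, 1] that a renderer could use,
    e.g. to brighten guidance cues when the player appears to be searching."""
    d = gaze_dispersion(window)
    return min(1.0, max(0.0, (d - low) / (high - low)))

if __name__ == "__main__":
    focused = [(400.0 + i % 3, 300.0) for i in range(60)]        # steady fixation
    searching = [(i * 15.0 % 800, (i * 23.0) % 600) for i in range(60)]  # scattered gaze
    print("focused   ->", round(adapt_hint_strength(focused), 2))
    print("searching ->", round(adapt_hint_strength(searching), 2))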