2016
DOI: 10.1007/s00779-016-0969-x

Look together: using gaze for assisting co-located collaborative search

Abstract: Gaze information provides an indication of users' focus, which complements remote collaboration tasks, as distant users can see their partner's focus. In this paper, we apply gaze to co-located collaboration, where users' gaze locations are presented on the same display to help collaboration between partners. We integrated various types of gaze indicators into the user interface of a collaborative search system, and we conducted two user studies to understand how gaze enhances coordination and communication between…

Cited by 82 publications (53 citation statements)
References 30 publications
“…However, the allowed gaze estimation error depends on the sizes of the gaze targets. These gaze targets could be fine-level details on the screen [11], closely connected [77] or separate content on the screen [10], large physical objects [2], or rough gaze direction [45,69]. Gaze-based user modelling and passive eye monitoring require gaze estimation to detect gaze patterns instead of individual points.…”
Section: Gaze Applications (mentioning)
confidence: 99%
“…Systems such as Looking Glass [26], Media Ribbon [2], StrikeAPose [34] and MyPosition [31] support multi-user interaction by mid-air gestures. Gaze-enabled public displays have also started to support multiple users [12,16,38]. Other displays allow interaction via mobile devices [14,18,32].…”
Section: Multi-user Interaction On Public Displays (mentioning)
confidence: 99%
“…In the vast majority of cases people approach public displays in groups [7,11,26], which led to an increasing number of very large displays to allow interaction by multiple users [1,2]. Multi-user interactive public displays often respond to individual users by assigning a visual representation to each user [17,33,38]. The past years witnessed an extensive employment of user representations on the display.…”
Section: Introduction (mentioning)
confidence: 99%
“…A prerequisite of reaching a shared team understanding is to attain joint attention in the team to initiate a dialogue (Harvey, 2014). Joint attention may be defined as participants being mutually oriented to a common part of their visible environment, and aware that their conversational partners are also looking at it (Whittaker & O'Conaill, 1997; Zhang et al., 2017).…”
Section: Communicative Resources and Joint Attention (mentioning)
confidence: 99%