Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems (CHI 2016)
DOI: 10.1145/2858036.2858137
Gaze-based Notetaking for Learning from Lecture Videos

Cited by 18 publications (12 citation statements)
References 25 publications
“…Taking eye gaze from users as a command to computers is the most intuitive gaze-aware application. The typical usage of eye gaze information is as a replacement of the mouse, such as typing words with the eye [40], indicating user attention [44], and selecting items [76]. Researchers have investigated daily human-computer interactions using different eye movements, such as fixations [16,35,40], smooth pursuit [13], and eye gestures [37].…”
Section: Gaze-based Human-Computer Interaction
Mentioning confidence: 99%
“…Accuracy. Applications that rely on explicit eye input usually require high-accuracy gaze estimates, such as for eye typing [31,40], authentication [26], or system control [44]. However, the allowed gaze estimation error depends on the sizes of the gaze targets.…”
Section: Gaze Applications
Mentioning confidence: 99%
“…As a result, there is increasing interest in using information about eye gaze not only as a research instrument, but to enhance our everyday interaction with computers. Gaze-enabled applications and interaction techniques range from explicit gaze input, such as pointing [22,49] or gaze gestures [8,16,33] to attentive applications that use gaze to make inferences about the user's intentions [13,29,38] and improve input with other modalities [37]. Surprisingly, little work has been done to understand the requirements of such applications as they integrate into our everyday computer use.…”
Section: Introduction
Mentioning confidence: 99%
“…[Aula et al 2005; Dumais et al 2010; Gwizdka 2014; Kajan et al 2016; Salojärvi et al 2003]). Only a few researchers have considered more unconstrained settings where ground-truth labels are not available, mostly to establish general interest in displayed information [Alt et al 2012; Nguyen and Liu 2016; Qvarfordt and Zhai 2005]. These combine a few well-established gaze metrics but are highly tuned to their specific applications.…”
Section: Background and Related Work
Mentioning confidence: 99%
“…Several works have analyzed gaze behavior while exploring a map [Krejtz et al 2017], for example to adapt complex legends to show only relevant items [Göbel et al 2018], to highlight important points of interest to facilitate planning [Göbel and Kiefer 2019], or to follow up on users' interests [Qvarfordt and Zhai 2005]. Other systems extract relevant information to compile a summary for later use [Buscher et al 2012a; Nguyen and Liu 2016]. UI adaptation based on the user's cognitive load or context has recently been shown for mixed-reality settings [Gebhardt et al 2019; Lindlbauer et al 2019], where designing a good UI can be difficult as the user's context frequently changes, similar to other applications of ubiquitous computing [Dourish 2004].…”
Section: UI Adaptation from Gaze Behavior
Mentioning confidence: 99%