Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems
DOI: 10.1145/3313831.3376449

Quantification of Users' Visual Attention During Everyday Mobile Device Interactions

Cited by 21 publications (16 citation statements) | References 51 publications
“…Audio recording is done with a ReSpeaker Mic Array v2.0. To record the gaze and fixations necessary to generate labels for training and validation of the model, a Pupil Labs Core eye tracker is utilized.…”
Section: Methods
confidence: 99%
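The methods excerpt above (from a citing study) mentions using a Pupil Labs Core eye tracker to record the gaze and fixations that serve as training and validation labels. Purely as an illustrative sketch, and not the procedure of either paper, the Python snippet below shows one way a Pupil Player fixation export could be turned into binary on-device attention labels; the file path, the fixations.csv column names, and the fixed screen region in normalized scene-camera coordinates are all assumptions.

# Illustrative sketch only: derive binary "gaze on device" labels from a
# Pupil Labs Core / Pupil Player fixation export. The column names and the
# screen region below are assumptions, not taken from the cited papers.
import pandas as pd

# Hypothetical export path produced by Pupil Player.
fixations = pd.read_csv("exports/000/fixations.csv")

# Assumed device-screen region in normalized scene-camera coordinates (0..1).
# In practice this region would come from marker- or model-based screen detection.
X_MIN, X_MAX, Y_MIN, Y_MAX = 0.35, 0.65, 0.20, 0.80

# A fixation counts as "on device" if its normalized position falls inside the region.
fixations["on_device"] = (
    fixations["norm_pos_x"].between(X_MIN, X_MAX)
    & fixations["norm_pos_y"].between(Y_MIN, Y_MAX)
)

# One label per fixation: onset, duration, and whether it landed on the device.
fixations[["start_timestamp", "duration", "on_device"]].to_csv(
    "attention_labels.csv", index=False
)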
“…While such filtering of information is an important contributor to the efficiency of human cognition, it can also occasionally lead to missed information. Modeling attention is of interest for Human-Computer Interaction (HCI) research [4,30], for example to evaluate interfaces, to create attention-adaptive interfaces or for tutoring systems.…”
Section: Introduction
confidence: 99%
“…To understand various aspects of attention in real-world mobile HCI tasks in an effort to design interfaces for limited attention spans, HCI researchers have investigated attention allocation on the device or environment based on task levels [15,68,83]. For example, establishing that users' attention span is 4 to 8 seconds on mobile devices [68] has helped designers to size information chunks accordingly such that information can be effectively consumed in a short duration or glance.…”
Section: Multitasking Attention Fragmentation and Visual Behaviors
confidence: 99%
“…This functionality is supported by mobile enterprise application platforms or integrated development environments. Past research suggests that mobile devices have been evaluated from different perspectives and in different scenarios, such as healthcare architecture (Liu et al., 2019), privacy and usability in mobile health systems (Katusiime & Pinkwart, 2019), and measuring visual screen time for daily-routine mobile applications (Bâce et al., 2020).…”
Section: Mobile Devices
confidence: 99%