Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems
DOI: 10.1145/3544548.3580866
Dynamics of eye-hand coordination are flexibly preserved in eye-cursor coordination during an online, digital, object interaction task

Abstract: Do patterns of eye-hand coordination observed during real-world object interactions apply to digital, screen-based object interactions? We adapted a real-world object interaction task (physically transferring cups in sequence about a tabletop) into a two-dimensional screen-based task (dragging-and-dropping circles in sequence with a cursor). We collected gaze (with webcam eye-tracking) and cursor position data from 51 fully-remote, crowd-sourced participants who performed the task on their own computer. We app…

Cited by 5 publications (2 citation statements). References 53 publications (87 reference statements).
“…Using custom MATLAB scripts and our Gaze and Movement Analysis software, our initial data cleaning approach aimed to assess whether the gaze data showed reasonable patterns or whether it contained noisy, spurious gaze prediction errors (see [4] for a similar approach). Using more exaggerated but still mutually exclusive boundaries (see Figure 3B, AOI Dwell Boundaries), we determined whether the gaze fell inside at least one of the task-critical areas during the decision period.…”
Section: Data Processing
confidence: 99%
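The dwell check described in this citation statement can be illustrated with a minimal sketch. The cited work used custom MATLAB scripts and Gaze and Movement Analysis software, not the code below; the Python snippet, the AOI names, the normalized boundary coordinates, and the gaze-sample format are all assumptions for illustration of the idea (flag a trial if gaze never lands inside any task-critical area during the decision period).

```python
# Hypothetical sketch of an AOI dwell check for webcam gaze data.
# AOI names, boundary sizes, and the (x, y) sample format are assumptions.
from dataclasses import dataclass

@dataclass
class AOI:
    name: str
    x_min: float
    x_max: float
    y_min: float
    y_max: float

    def contains(self, x: float, y: float) -> bool:
        # Axis-aligned rectangular dwell boundary.
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max

# Exaggerated but mutually exclusive dwell boundaries
# (normalized screen coordinates, assumed for illustration).
AOIS = [
    AOI("home", 0.00, 0.30, 0.30, 0.70),
    AOI("pickup", 0.35, 0.65, 0.30, 0.70),
    AOI("dropoff", 0.70, 1.00, 0.30, 0.70),
]

def gaze_hits_any_aoi(gaze_samples, aois=AOIS) -> bool:
    """Return True if any (x, y) gaze sample falls inside at least one AOI.

    Trials where gaze never enters a task-critical area during the decision
    period would be flagged as likely containing spurious gaze predictions.
    """
    return any(aoi.contains(x, y) for x, y in gaze_samples for aoi in aois)

# Example: the second sample lands in the "pickup" AOI, so the trial passes.
print(gaze_hits_any_aoi([(0.95, 0.05), (0.50, 0.50)]))  # True
```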
“…A major drawback shared by all of the aforementioned eye-tracking studies is their confinement to laboratory settings. Recently, however, the use of webcam eye-tracking has emerged as a promising avenue for bridging the gap between controlled laboratory experiments and data collected in a wide range of environments (e.g., [49,47,4]). Admittedly, webcam eye-tracking is still a method in its infancy and has notable limitations in both temporal and spatial accuracy [40,3,49].…”
Section: Introduction
confidence: 99%