28th International Conference on Intelligent User Interfaces 2023
DOI: 10.1145/3581754.3584179
Interactive Fixation-to-AOI Mapping for Mobile Eye Tracking Data based on Few-Shot Image Classification

Abstract: Mobile eye tracking is an important tool in psychology and human-centred interaction design for understanding how people process visual scenes and user interfaces. However, analysing recordings from mobile eye trackers, which typically include an egocentric video of the scene and a gaze signal, is a time-consuming and largely manual process. To address this challenge, we propose a web-based annotation tool that leverages few-shot image classification and interactive machine learning (IML) to accelerate the anno…

Cited by 4 publications (3 citation statements)
References 27 publications
“…Silva Machado et al. (2019) matched the detected bounding boxes with participants' fixations using a sliding-window approach with a MobileNet CNN model. Rather than using the model for inference on large data for automatic analysis, several tools are multi-purpose and offer interactive visualization for manual annotation (Barz et al., 2023; Kurzhals, 2021; Kurzhals et al., 2017, 2020; Panetta et al., 2019). Kurzhals et al. (2017) presented an interactive labeling tool with automatic clustering combined with an analysis system.…”
Section: Deep Learning for Eye Tracking Data Analysis
confidence: 99%
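The matching step described in the statement above can be sketched as a point-in-box test over detected bounding boxes, followed by a sliding-window majority vote to stabilise the label sequence. This is a minimal illustration of the general idea, not Silva Machado et al.'s implementation; all names, box coordinates, and the smoothing scheme are hypothetical.

```python
# Hypothetical sketch: map fixations to AOIs by testing each (x, y) fixation
# against detected bounding boxes (x1, y1, x2, y2), then smooth the resulting
# label sequence with a centered sliding-window majority vote.
from collections import Counter

def match_fixation(point, boxes):
    """Return the label of the first box containing the point, else None."""
    x, y = point
    for label, (x1, y1, x2, y2) in boxes.items():
        if x1 <= x <= x2 and y1 <= y <= y2:
            return label
    return None

def smooth_labels(labels, window=3):
    """Majority vote over a centered window to reduce label flicker."""
    half = window // 2
    smoothed = []
    for i in range(len(labels)):
        votes = [l for l in labels[max(0, i - half):i + half + 1] if l is not None]
        smoothed.append(Counter(votes).most_common(1)[0][0] if votes else None)
    return smoothed

# Toy example: one spurious "keyboard" fixation amid a run on the "screen" AOI.
boxes = {"screen": (0, 0, 100, 60), "keyboard": (0, 60, 100, 100)}
fixations = [(50, 30), (52, 31), (50, 70), (51, 32), (55, 33)]
labels = [match_fixation(p, boxes) for p in fixations]
print(smooth_labels(labels))
```

The smoothing window here plays the same role as the sliding window mentioned in the citation: isolated mismatches inside a stable fixation run get voted away.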
“…They went on to develop image-based (Kurzhals et al., 2020) and gaze-patch (Kurzhals, 2021) techniques for dynamic AOI annotation that are conceptually similar to our proposed idea of merging gaze data with video. Barz et al. (2023) implemented an approach based on both image classification and object detection. They used a few-shot learning method for its adaptability, with a 50-layer CNN (ResNet50).…”
Section: Deep Learning for Eye Tracking Data Analysis
confidence: 99%
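The few-shot idea referenced above can be illustrated with a nearest-prototype classifier: a frozen backbone (e.g. a ResNet50) embeds gaze-centered image crops, a handful of labeled support embeddings per AOI are averaged into class prototypes, and each new fixation crop is assigned to the nearest prototype. The sketch below stubs the embeddings with toy 2-D vectors; all names and values are assumptions, not taken from Barz et al.'s tool.

```python
# Hypothetical sketch of nearest-prototype few-shot classification over
# CNN embeddings of gaze-centered crops. Real embeddings would come from a
# frozen backbone such as ResNet50; toy 2-D vectors stand in for them here.
import numpy as np

def build_prototypes(support):
    """support: {aoi_label: [embedding, ...]} -> {aoi_label: mean embedding}."""
    return {label: np.mean(vecs, axis=0) for label, vecs in support.items()}

def classify(embedding, prototypes):
    """Return the AOI label whose prototype is nearest in Euclidean distance."""
    return min(prototypes, key=lambda l: np.linalg.norm(embedding - prototypes[l]))

# Two labeled examples per AOI are enough to form prototypes.
support = {
    "monitor":  [np.array([1.0, 0.1]), np.array([0.9, 0.0])],
    "notebook": [np.array([0.0, 1.0]), np.array([0.1, 0.9])],
}
protos = build_prototypes(support)
print(classify(np.array([0.8, 0.2]), protos))
```

Because only the prototypes change when labels are added, this style of classifier adapts immediately to new annotations, which is the adaptability property the citation attributes to the few-shot method.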