Proceedings of the 4th ACM/IEEE International Conference on Human Robot Interaction 2009
DOI: 10.1145/1514095.1514185

A preliminary system for recognizing boredom

Abstract: A 3D optical flow tracking system was used to track participants as they watched a series of boring videos. The video stream of the participants was rated for boredom events. Ratings and head position data were combined to predict boredom events.
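
To make the pipeline described in the abstract concrete, the sketch below (Python, not taken from the paper) assumes the tracker exports per-frame 3D head positions and that human raters supply frame indices of boredom events; it windows the head-motion signal and fits a simple classifier. The mean/std displacement features, the logistic-regression model, and the helper names (window_features, window_labels) are illustrative assumptions, not the authors' method.

```python
# Illustrative sketch only: combine tracked head positions with rated boredom
# events to predict boredom, assuming per-frame 3D head positions from the
# tracker and rater-supplied event frame indices.
import numpy as np
from sklearn.linear_model import LogisticRegression

def window_features(head_xyz, fps=30, win_s=2.0):
    """Summarize head motion in fixed-length windows.

    head_xyz: (n_frames, 3) array of tracked head positions.
    Returns (n_windows, 2): mean and std of frame-to-frame displacement.
    """
    disp = np.linalg.norm(np.diff(head_xyz, axis=0), axis=1)  # per-frame motion
    win = int(fps * win_s)
    n = len(disp) // win
    feats = [(disp[i * win:(i + 1) * win].mean(), disp[i * win:(i + 1) * win].std())
             for i in range(n)]
    return np.asarray(feats)

def window_labels(event_frames, n_frames, fps=30, win_s=2.0):
    """Label a window 1 if any rated boredom event falls inside it."""
    win = int(fps * win_s)
    n = (n_frames - 1) // win
    labels = np.zeros(n, dtype=int)
    for f in event_frames:
        idx = f // win
        if idx < n:
            labels[idx] = 1
    return labels

# Synthetic data stands in for tracker output and rater annotations so the
# sketch runs end to end; in the paper the labels come from human ratings
# of the participants' video streams.
rng = np.random.default_rng(0)
head_xyz = np.cumsum(rng.normal(scale=0.01, size=(9000, 3)), axis=0)  # ~5 min @ 30 fps
event_frames = [1200, 4500, 7800]                                     # rated boredom events

X = window_features(head_xyz)
y = window_labels(event_frames, len(head_xyz))
clf = LogisticRegression(class_weight="balanced").fit(X, y)
print(clf.predict_proba(X[:5])[:, 1])  # per-window boredom probabilities
```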

Cited by 6 publications (2 citation statements)
References 2 publications

“…This approach, however, relies on the honesty of the students, has the potential of interrupting the primary task of learning, and cues students to the fact that the tutor is explicitly monitoring their boredom levels. Alternatively, sensors that track facial features, physiology, reaction time, and other informational channels could be used to corroborate the gaze-based diagnosis of boredom (Beck, 2005; Cocea & Weibelzahl, 2009; D'Mello & Graesser, 2010b; Drummond & Litman, 2010; Jacobs et al., 2009).…”
Section: Limitations and Future Work (mentioning, confidence: 99%)
“…These categories are based primarily on physical actions, and coded from videos taken of the entire experiments, a technique used in coding attention states in previous studies [71,72]. We recognize that they are not absolute in that just because someone was looking at a screen, this does not mean he or she was cognitively engaged with a UAV control activity.…”
Section: Attention States (mentioning, confidence: 99%)