2019 IEEE Winter Conference on Applications of Computer Vision (WACV)
DOI: 10.1109/wacv.2019.00178
Eyemotion: Classifying Facial Expressions in VR Using Eye-Tracking Cameras

Abstract: One of the main challenges of social interaction in virtual reality settings is that head-mounted displays occlude a large portion of the face, blocking facial expressions and thereby restricting social engagement cues among users. Hence, auxiliary means of sensing and conveying these expressions are needed. We present an algorithm to automatically infer expressions by analyzing only a partially occluded face while the user is engaged in a virtual reality experience. Specifically, we show that images of the us…

Cited by 105 publications (75 citation statements). References 43 publications.
“…Furthermore, deep learning techniques have been thoroughly applied by the participants of these two challenges (e.g., [240], [241], [242]). Additional related real-world applications, such as the Real-time FER App for smartphones [243], [244], Eyemotion (FER using eye-tracking cameras) [245], privacy-preserving mobile analytics [246], Unfelt emotions [247] and Depression recognition [248], have also been developed.…”
Section: Other Special Issues
confidence: 99%
“…The task is particularly challenging because we only have the user's eye regions as input, rather than the entire face as used in most emotive facial expression classification benchmarks [26]. To our knowledge, only two previous studies have identified facial expressions from the eyes; however, in those studies a view of the eyebrows was available [25], [16]. Eyebrows are considered important for emotive expression.…”
Section: Facial Expression Classification
confidence: 99%
“…For the facial expression classification, we collected data from 15 subjects. We trained subjects to perform facial action units and labelled these data using the Facial Action Coding System, a widely recognized method for coding individual facial muscle movements [7]. These facial action units can then be mapped onto emotive expressions.…”
Section: Labeling
confidence: 99%
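The mapping from facial action units (AUs) to emotive expressions mentioned above can be sketched as a simple lookup against prototype AU combinations. This is a hypothetical, minimal illustration only — the AU prototypes below follow commonly cited FACS conventions (e.g., AU6 + AU12 for happiness), but real AU-to-emotion mappings are more nuanced, and the function name and matching rule are assumptions, not the paper's method.

```python
# Illustrative sketch: prototype AU sets for a few basic expressions.
# AU numbers follow FACS conventions (e.g., AU6 = cheek raiser,
# AU12 = lip corner puller); the mapping itself is simplified.
AU_TO_EMOTION = {
    frozenset({6, 12}): "happiness",       # cheek raiser + lip corner puller
    frozenset({1, 4, 15}): "sadness",      # inner brow raiser + brow lowerer + lip corner depressor
    frozenset({1, 2, 5, 26}): "surprise",  # brow raisers + upper lid raiser + jaw drop
    frozenset({4, 5, 7, 23}): "anger",     # brow lowerer + lid raiser/tightener + lip tightener
    frozenset({9, 15}): "disgust",         # nose wrinkler + lip corner depressor
}

def map_aus_to_emotion(active_aus):
    """Return the expression whose full AU prototype is contained in
    the detected AUs, preferring the largest match; else 'neutral'."""
    active = set(active_aus)
    best, best_size = "neutral", 0
    for proto, emotion in AU_TO_EMOTION.items():
        if proto <= active and len(proto) > best_size:
            best, best_size = emotion, len(proto)
    return best
```

A classifier predicting per-frame AU activations could feed its output directly into such a table to obtain a coarse expression label.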
“…They also depend on the features of the object of interest [4,5]. Accordingly, bottom-up and top-down approaches are mainly driven by the visual characteristics of a scene and the task of interest, respectively [6,7].…”
Section: Introduction
confidence: 99%