2017 Seventh International Conference on Affective Computing and Intelligent Interaction (ACII)
DOI: 10.1109/acii.2017.8273588
Discovering gender differences in facial emotion recognition via implicit behavioral cues

Abstract: We examine the utility of implicit behavioral cues in the form of EEG brain signals and eye movements for gender recognition (GR) and emotion recognition (ER). Specifically, the examined cues are acquired via low-cost, off-the-shelf sensors. We asked 28 viewers (14 female) to recognize emotions from unoccluded (no mask) as well as partially occluded (eye and mouth masked) emotive faces. Obtained experimental results reveal that (a) reliable GR and ER is achievable with EEG and eye features, (b) differential co…

Cited by 12 publications (11 citation statements) · References 27 publications
“…Conspicuous facial cues are studied in [18] to detect multimedia highlights, while implicit physiological measurements are employed to model emotions induced by music and movie scenes respectively in [20] and [3]. EEG and eye movements are two popular implicit modalities employed for ER, and many works have predicted affect with a combination of both [7,30,58], or exclusively using either signal [3,19,27,40,46,47,59].…”
Section: User-centered ER
confidence: 99%
“…Conspicuous facial cues are studied to detect multimedia highlights in [9], while physiological measurements are utilized to model emotions induced by music and movie scenes in [29], [30]. EEG and eye movements are two popular modalities employed for ER, and many works have used a combination of both [20], [27], [28] or either signal exclusively [21], [24], [29], [31], [36], [41], [42].…”
Section: User-centered ER
confidence: 99%
“…By capitalising on the P300, a computer can detect when a target letter flashes on a screen, thereby allowing selection of letters without physical interaction. Though there were several EPOC studies in this review that investigated traditional P300 speller BCI interfaces [34][35][36][37], others harnessed the P300 for such purposes as interacting with navigation systems [38] and robotic devices [39]. Suhas, Dhal (40) investigated using ERPs to control a light bulb and a fan, with an eye towards giving physically disabled individuals control of 'smart' appliances.…”
confidence: 99%