2020 | DOI: 10.3389/fpsyg.2020.00920
The Facial Action Coding System for Characterization of Human Affective Response to Consumer Product-Based Stimuli: A Systematic Review

Abstract: To characterize human emotions, researchers have increasingly utilized Automatic Facial Expression Analysis (AFEA), which automates the Facial Action Coding System (FACS) and translates facial muscular positioning into the basic universal emotions. There is broad interest in applying FACS to assess consumer expressions as an indication of emotions elicited by consumer product-based stimuli. However, the translation of FACS to the characterization of emotions remains elusive in the literature. The aim of this systema…
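
As a concrete illustration of the "translation" step the abstract describes, here is a minimal Python sketch that maps a set of detected FACS Action Units (AUs) to a basic-emotion label. The AU prototypes and the 0.5 overlap threshold are simplified, illustrative assumptions, not the coding rules of the reviewed AFEA systems.

```python
# Illustrative sketch only: translate detected FACS Action Units (AUs) into a
# basic-emotion label. The prototypes below are simplified, EMFACS-style
# examples, not the coding table used by any particular AFEA engine.
EMOTION_PROTOTYPES = {
    "happiness": {6, 12},        # cheek raiser + lip corner puller
    "sadness":   {1, 4, 15},     # inner brow raiser, brow lowerer, lip corner depressor
    "surprise":  {1, 2, 5, 26},  # brow raisers, upper lid raiser, jaw drop
    "anger":     {4, 5, 7, 23},  # brow lowerer, lid tighteners, lip tightener
    "disgust":   {9, 15},        # nose wrinkler, lip corner depressor
}

def translate_aus(detected_aus: set) -> str:
    """Return the basic emotion whose AU prototype best overlaps the detection."""
    scores = {emotion: len(aus & detected_aus) / len(aus)
              for emotion, aus in EMOTION_PROTOTYPES.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] >= 0.5 else "neutral"

print(translate_aus({6, 12, 25}))  # AU25 (lips part) plus AU6+AU12 -> "happiness"
```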

Citations: cited by 97 publications (33 citation statements)
References: 148 publications (165 reference statements)
“…However, our affective displays in reality are much more complicated and subtle compared to the simplicity of these universal emotions. To represent the complexity of the emotional spectrum, many approaches were proposed such as the Facial Action Coding System [8], where all facial actions are described in terms of Action Units (AUs); or dimensional models [46], where affection is quantified by values chosen over continuous emotional scales like valence and arousal. Nevertheless, those models which use discrete affections are the most popular in automatic emotion recognition task because they are easier to interpret and more intuitive to human.…”
Section: Human Emotion (mentioning)
confidence: 99%
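
To make the contrast drawn in this passage concrete, the sketch below (a rough illustration, not taken from the cited works) represents an observation dimensionally as valence/arousal coordinates and then collapses it to the nearest discrete label; the anchor coordinates are assumed circumplex-style placements chosen for illustration.

```python
# Illustrative sketch only: contrast a continuous valence/arousal reading with
# the discrete labels most automatic emotion recognition systems report.
# The (valence, arousal) anchors are rough, assumed placements, not measured values.
import math

DISCRETE_ANCHORS = {
    "happiness": ( 0.8,  0.4),
    "surprise":  ( 0.2,  0.8),
    "fear":      (-0.6,  0.7),
    "anger":     (-0.7,  0.5),
    "disgust":   (-0.7,  0.1),
    "sadness":   (-0.6, -0.4),
    "neutral":   ( 0.0,  0.0),
}

def nearest_discrete_label(valence: float, arousal: float) -> str:
    """Collapse a dimensional reading to the closest discrete emotion label."""
    return min(DISCRETE_ANCHORS,
               key=lambda emo: math.dist((valence, arousal), DISCRETE_ANCHORS[emo]))

# A clearly positive, moderately aroused observation keeps its nuance as
# (0.6, 0.3) in the dimensional model, but collapses to one intuitive label:
print(nearest_discrete_label(0.6, 0.3))  # -> "happiness"
```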
“…The computer webcam recorded the respondent face during each 30–40 sec PSA. These recordings were uploaded to iMotions ® for analysis with the validated AFFDEX automated facial coding engine (Clark et al, 2020; De Lemos, 2007; Ekman & Friesen, 1978; Ekman, 1997; Stöckli et al, 2017). In this paper, “attention,” a measure of head position (pitch, yaw, and roll) relative to the camera, and 20 specific facial expressions (what the software algorithm classifies as, for example, “cheek raise,” “lip curl”) are used.…”
Section: Methods (mentioning)
confidence: 99%
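
The AFFDEX/iMotions attention metric itself is proprietary, so the following is only a hypothetical approximation of the idea described above: attention is treated as how closely the head faces the camera, decaying as pitch, yaw, and roll deviate from zero. The 30° cutoff, the half-weighting of roll, and the function name are assumptions made for illustration.

```python
# Hypothetical sketch, not the AFFDEX algorithm: approximate "attention" from
# head pose (degrees) relative to the camera. Cutoff and weights are assumed.
import math

def attention_proxy(pitch: float, yaw: float, roll: float,
                    max_angle: float = 30.0) -> float:
    """Return a 0..1 attention score; 1.0 means facing the camera head-on."""
    # Roll (tilting the head sideways) affects facing direction less than
    # pitch/yaw, so it is down-weighted here; this weighting is an assumption.
    deviation = math.sqrt(pitch ** 2 + yaw ** 2 + (0.5 * roll) ** 2)
    return max(0.0, 1.0 - deviation / max_angle)

# Example: a respondent looking slightly down and to the side of the screen.
print(round(attention_proxy(pitch=-8.0, yaw=12.0, roll=3.0), 2))  # -> 0.52
```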
“…Researchers at a large southeastern university were awarded a grant from the state’s anti-tobacco agency to explore augmenting opinion-based surveys with data collected using biometric sensors (Al-Turk et al., 2018; Huseynov et al., 2019; Venkatraman et al., 2015; Cartocci et al., 2017). Our goal was to determine the value added by physiological neuro-metric tools (galvanic skin response, eye tracking, and facial expression analysis (FEA)) (Clark et al., 2020; Hamelin et al., 2017; Joanne et al., 2019; Lewinski et al., 2014; Kong et al., 2020; Schmälzle & Meshi, 2020) to on-line surveys that use subjective measures for message pretesting.…”
Section: Introduction (mentioning)
confidence: 99%
“…Facial expressions have long been used to indicate emotions and stayed central in emotion studies (Tomkins and McCarter, 1964; Russell, 1994; Ruba and Repacholi, 2020). The accuracy of using facial expressions to identify emotions has been justified through many ways, for instance, self-report instruments (Matsumoto, 1987; Matsumoto et al., 2000) and facial coding systems (Ekman et al., 1980; Clark et al., 2020; Rosenberg and Ekman, 2020). There is debate around the universality of facial expression.…”
Section: Facial Expression and Emotion (mentioning)
confidence: 99%