Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems
DOI: 10.1145/2851581.2890247
AFFDEX SDK: A Cross-Platform Real-Time Multi-Face Expression Recognition Toolkit

Cited by 233 publications (41 citation statements)
References 9 publications
“…[14]), and workload (Driving Activity Load Index [38]). We furthermore analyzed the driver's facial expressions regarding displayed emotions using the Affdex SDK [30] and assessed the propriety of the used matching algorithm, i.e., whether or not the recommended character matched with the preferred character. At the end of the experiment, participants answered questions on the experienced characters in a semi-structured interview and assessed the perceived personalities using a semantic differential rating.…”
Section: Methods (mentioning)
confidence: 99%
“…We evaluate the output of the Affdex facial expression detection system [30] for a 10-second time frame after each interaction. This duration was chosen as literature states emotional responses to auditory stimuli are processed against preexisting expectations within less than 10 seconds after… [Table 3: Emotion detection values for a 10-second time frame after experiencing the characters.]”
Section: Emotion Recognition (mentioning)
confidence: 99%
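The windowed analysis described in this excerpt can be sketched as follows. This is a minimal illustration assuming the detector output has already been exported to per-frame records with a timestamp and per-emotion scores; the field names (`timestamp`, `joy`, ...) and the `interaction_times` list are placeholders, not the Affdex SDK's actual export format.

```python
from statistics import mean

WINDOW_SECONDS = 10  # evaluation window after each interaction, as in the study


def window_means(frames, interaction_times, emotions=("joy", "surprise", "anger")):
    """Average per-emotion scores over the 10 s following each interaction.

    `frames` is assumed to be a list of dicts like
    {"timestamp": 12.3, "joy": 4.1, "surprise": 0.0, "anger": 0.2},
    i.e. per-frame detector output exported beforehand (hypothetical format).
    """
    results = []
    for t0 in interaction_times:
        in_window = [f for f in frames if t0 <= f["timestamp"] < t0 + WINDOW_SECONDS]
        if not in_window:
            results.append({e: None for e in emotions})
            continue
        results.append({e: mean(f[e] for f in in_window) for e in emotions})
    return results
```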
“…Affdex performs automatic facial coding in four steps: face and facial landmark detection, face feature extraction, facial action classification, and emotion expression modeling based on the EMFACS emotional facial action coding system (Ekman and Friesen, 1978; Friesen and Ekman, 1983; McDuff et al, 2016). Although no data has been published yet specifically comparing the performance of the software on adults vs. children, FACS coding is generally the same for adults and for children and has been used with children as young as 2 years (e.g., Camras et al, 2006; LoBue and Thrasher, 2015; also see Ekman and Rosenberg, 1997).…”
Section: Methods (mentioning)
confidence: 99%
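The four-step coding process described in this excerpt can be outlined schematically. The functions below are illustrative placeholders for the stages (face and landmark detection, feature extraction, facial action classification, EMFACS-based emotion modeling); they are not the Affdex SDK API, and the dummy return values only serve to make the control flow runnable.

```python
# Illustrative sketch of the four facial-coding stages described above.
# None of these functions exist in the actual Affdex SDK.

def detect_face_and_landmarks(image):
    # Stage 1: face and facial landmark detection (placeholder output).
    return {"bbox": (0, 0, 100, 100)}, [(30, 40), (70, 40), (50, 70)]


def extract_features(face, landmarks):
    # Stage 2: face feature extraction over the detected region (placeholder).
    return [0.0] * 128


def classify_action_units(features):
    # Stage 3: facial action classification (placeholder AU scores).
    return {"AU6": 0.8, "AU12": 0.9, "AU4": 0.1}


def model_emotions(action_units):
    # Stage 4: emotion expression modeling via EMFACS-style rules,
    # e.g. joy requires cheek raiser (AU6) plus lip corner puller (AU12).
    joy = min(action_units.get("AU6", 0.0), action_units.get("AU12", 0.0))
    anger = action_units.get("AU4", 0.0)
    return {"joy": joy, "anger": anger}


def code_frame(image):
    face, landmarks = detect_face_and_landmarks(image)
    if face is None:
        return None
    features = extract_features(face, landmarks)
    action_units = classify_action_units(features)
    return model_emotions(action_units)


print(code_frame(image=None))  # {'joy': 0.8, 'anger': 0.1}
```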
“…Children’s affect data were collected using Affdex whenever a face was detected with the front-facing camera on the Samsung Galaxy S4 device (McDuff et al, 2016). Affdex is capable of measuring 15 expressions, which are used to calculate the likelihood that the detected face is displaying each of nine different affective states.…”
Section: Methods (mentioning)
confidence: 99%
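How expression metrics might be combined into affective-state likelihoods can be illustrated with a small sketch. The expression names, weights, and the three states shown are placeholders chosen for illustration; the actual 15 expressions, nine affective states, and scoring rules are defined by the Affdex SDK and are not reproduced here.

```python
# Hypothetical sketch: combine per-frame expression scores (0-100) into
# likelihoods for affective states. Names and weights are illustrative only.

EXPRESSION_WEIGHTS = {
    "joy":      {"smile": 1.0, "cheek_raise": 0.5, "brow_furrow": -0.5},
    "surprise": {"brow_raise": 1.0, "eye_widen": 0.8, "jaw_drop": 0.6},
    "anger":    {"brow_furrow": 1.0, "lid_tighten": 0.7, "lip_press": 0.5},
}


def affect_likelihoods(expressions):
    """Map expression scores (dict of name -> 0..100) to state likelihoods (0..1)."""
    likelihoods = {}
    for state, weights in EXPRESSION_WEIGHTS.items():
        score = sum(w * expressions.get(name, 0.0) for name, w in weights.items())
        likelihoods[state] = max(0.0, min(1.0, score / 100.0))
    return likelihoods


print(affect_likelihoods({"smile": 90, "cheek_raise": 60, "brow_raise": 10}))
# {'joy': 1.0, 'surprise': 0.1, 'anger': 0.0}
```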
“…Facial expression recognition algorithms infer emotions using both geometric (position of nose, eyes, mouth) and appearance (pixel values) features of the face. These features are used by some tools such as FaceReader (Loijens and Krips, 2019), Microsoft Emotion API (Microsoft.com, n.d.), Affdex SDK (McDuff et al, 2016) and Google Emotion API (Google.com, n.d.) to recognize emotions by providing numerical values associated with them. Some tools show high accuracy in detecting basic emotions, but only on specific datasets (e.g.…”
Section: Emotion Recognition and Learning Analytics (mentioning)
confidence: 99%
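The distinction between geometric and appearance features made in this excerpt can be made concrete with a short sketch: geometric features derive from landmark positions (e.g., mouth width relative to inter-eye distance), appearance features from pixel intensities (e.g., an intensity histogram over the face crop). The specific features and landmark names below are illustrative and not taken from any of the cited tools.

```python
import numpy as np


def geometric_features(landmarks):
    """Distances and ratios computed from landmark positions (illustrative choices).

    `landmarks` is assumed to be a dict of name -> (x, y) from any landmark detector.
    """
    mouth_w = np.linalg.norm(np.subtract(landmarks["mouth_right"], landmarks["mouth_left"]))
    eye_dist = np.linalg.norm(np.subtract(landmarks["eye_right"], landmarks["eye_left"]))
    return np.array([mouth_w, mouth_w / eye_dist])


def appearance_features(gray_face, bins=16):
    """Pixel-intensity histogram over the face crop (illustrative appearance feature)."""
    hist, _ = np.histogram(gray_face, bins=bins, range=(0, 255), density=True)
    return hist


def feature_vector(landmarks, gray_face):
    """Concatenate geometric and appearance features for a downstream emotion classifier."""
    return np.concatenate([geometric_features(landmarks), appearance_features(gray_face)])


# Toy usage with made-up landmark coordinates and a random grayscale face crop.
lm = {"mouth_left": (40, 70), "mouth_right": (60, 70), "eye_left": (35, 40), "eye_right": (65, 40)}
face_crop = np.random.default_rng(0).integers(0, 256, size=(64, 64))
print(feature_vector(lm, face_crop).shape)  # (18,) = 2 geometric + 16 appearance values
```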