2017
DOI: 10.1016/j.procir.2017.01.014

Applying FaceReader to Recognize Consumer Emotions in Graphic Styles

Cited by 39 publications (24 citation statements); references 20 publications.
“…Significant trends in happiness, surprise, and disgust were observed across the experimental images. Researchers noted that while simple elements and sharp edges showed higher attention and valence, higher valence was observed for pictorial images as opposed to computer-generated images [64]. A study on chocolate packaging did not find any significant differences in AFEA data for the experimental package images [65].…”
Section: Differentiation of Stimuli by Expressed Emotion
Confidence: 96%
“…One study used the reduction in negative responses as an indicator of positive preference [63]. Studies on odors and images have shown that arousal/attention may be more important than valence for the characterization of samples by AFEA [51,64].…”
Section: Differentiation of Stimuli by Expressed Emotion
Confidence: 99%
“…Originally built upon an earlier software package called CERT (Littlewort et al., 2011), FACET was distributed by Emotient, whereas FaceReader was developed and first presented by VicarVision in 2005 (Den Uyl and Van Kuilenburg, 2005). Both systems have been used in a large number of scientific studies (e.g., Skiendziel, Rösch, & Schultheiss, 2019; for a review see Lewinski et al., 2014a) as well as in consumer behavior (Garcia-Burgos and Zamora, 2013; Danner et al., 2014; Yu and Ko, 2017) and marketing research (Lewinski et al., 2014b; McDuff et al., 2014). Nonetheless, there are several other promising off-the-shelf classifiers available today that could be employed for the same purposes.…”
Section: Abundant Choices: Classifiers Lack Cross-system Validation
Confidence: 99%
“…This method reduces the number of active neurons with a probability of 0.2. The next step is to calculate the input gate, which decides which values will be updated, as shown in (6).…”
Section: Recurrent Neural Network
Confidence: 99%
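Equation (6) of the citing paper is not reproduced in the excerpt above. The step it describes, dropping units with probability 0.2 and then computing an input gate that decides which values are updated, matches the standard LSTM formulation; the following is a sketch in conventional LSTM notation, assumed for illustration rather than taken from the citing paper:

i_t = \sigma(W_i \cdot [h_{t-1}, x_t] + b_i)          % input gate: which entries to update
\tilde{C}_t = \tanh(W_C \cdot [h_{t-1}, x_t] + b_C)   % candidate values admitted by the gate

where \sigma is the logistic sigmoid, h_{t-1} the previous hidden state, x_t the current input, and W_i, W_C, b_i, b_C are learned parameters.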
“…Previous studies measured positive and negative emotions that impact loyalty intentions. Positive emotions include happy and enthusiastic, while negative emotions include angry, disappointed, and hesitant [5]. The sensing of emotions has also been used to identify customers' emotional reactions to different types of images [6].…”
Section: Introduction
Confidence: 99%