2020
DOI: 10.3390/s20236716
Using a Social Robot to Evaluate Facial Expressions in the Wild

Abstract: In this work, an affective computing approach is used to study human-robot interaction, using a social robot to validate facial expressions in the wild. Our overall goal is to evaluate whether a social robot can interact in a convincing manner with human users and recognize their potential emotions through facial expressions, contextual cues, and bio-signals. In particular, this work focuses on analyzing facial expressions. A social robot is used to validate a pre-trained convolutional neural network …

Citations: cited by 24 publications (16 citation statements)
References: 48 publications
“…Regarding the computer resources used to recognize facial expressions, two implementations are reported using basic hardware computers [ 34 , 39 ] and computers with graphic cards [ 30 , 31 ]. It is worth noting the use of robotic platforms such as the NAO [ 27 , 28 , 33 , 38 ], R-50 Alice [ 35 , 40 ], Pepper [ 32 ], kiwi [ 44 ], and N-Maria [ 42 ].…”
Section: Discussion and Conclusion
confidence: 99%
“…Another example is presented by Ramis et al in [ 33 ]. The authors proposed an algorithm to recognize the emotions of people using the NAO robot, employing a game: (a) the VJ algorithm carries out face detection; (b) facial landmarks are located to calculate the center of each eye and the distance between them; (c) a CNN processes the image to determine whether the expression relates to happiness, sadness, disgust, anger, surprise, fear, or indifference.…”
Section: Algorithms Used For Face Recognition and Tracking
confidence: 99%
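The citation statement above outlines a three-step pipeline: Viola-Jones face detection, eye-centre landmarks used to normalize the detected face, and a CNN that classifies the expression into seven categories. The following is a minimal sketch of that pipeline in Python with OpenCV and TensorFlow; the model file name, the 48x48 grayscale input size, the label ordering, and the inter-eye distance threshold are illustrative assumptions, not details taken from the cited paper.

```python
# Sketch of a Viola-Jones + eye-landmark + CNN expression pipeline.
# Assumptions: "expression_cnn.h5" is a hypothetical pre-trained model
# expecting 48x48 grayscale crops; label order is illustrative.
import cv2
import numpy as np
import tensorflow as tf

LABELS = ["happiness", "sadness", "disgust", "anger", "surprise", "fear", "neutral"]

# (a) Viola-Jones detectors shipped with OpenCV
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

model = tf.keras.models.load_model("expression_cnn.h5")  # hypothetical model file

def classify_expressions(frame_bgr):
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.2, minNeighbors=5)
    results = []
    for (x, y, w, h) in faces:
        roi = gray[y:y + h, x:x + w]
        # (b) locate the eyes, compute their centres and the inter-eye
        #     distance, and skip faces that are too small or not frontal
        eyes = eye_cascade.detectMultiScale(roi)
        if len(eyes) < 2:
            continue
        (ex1, ey1, ew1, eh1), (ex2, ey2, ew2, eh2) = eyes[:2]
        c1 = np.array([ex1 + ew1 / 2.0, ey1 + eh1 / 2.0])
        c2 = np.array([ex2 + ew2 / 2.0, ey2 + eh2 / 2.0])
        inter_eye = np.linalg.norm(c1 - c2)
        if inter_eye < 0.2 * w:  # heuristic threshold (assumption)
            continue
        # (c) CNN classifies the cropped face into one of seven expressions
        face = cv2.resize(roi, (48, 48)).astype("float32") / 255.0
        probs = model.predict(face[np.newaxis, ..., np.newaxis], verbose=0)[0]
        results.append((LABELS[int(np.argmax(probs))], float(probs.max())))
    return results
```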
“…In the past decades, many researchers have investigated subjects’ reactions and potentials of different emotion recognition techniques, including speech, non-verbal audition, facial expression, visual and thermal images, peripheral neural signals, and central neural system signals [ 29 , 30 , 31 , 32 , 33 , 34 , 35 , 36 , 37 ].…”
Section: Related Work
confidence: 99%
“…Today, the need for emotion classification has surpassed the barrier of age because of the global shift towards online platforms: online education delivering knowledge virtually to remote areas worldwide, IoT-enabled health monitoring systems and temperature setters in cars and households, robotics, psychiatric evaluation based on the violent behaviors of criminals or the mentally disturbed, studies of mood swings in adolescents to guide them mentally, deepfake detection, gaming, and many other applications currently being innovated with state-of-the-art technologies. Also, numerous studies have been conducted on facial emotion recognition using computer vision because of its practicality in intelligent robotics, health-related treatment, IoT, security surveillance, criminal psychological analysis, observation of driver exhaustion, and other human-computer interface mechanisms [7][8][9]. With more virtual connectivity through videos and images, the need to adopt the latest technology based on people's emotions is now a critical factor in driving user-friendliness and maximum user satisfaction.…”
Section: Introduction
confidence: 99%