2020 15th IEEE International Conference on Automatic Face and Gesture Recognition (FG 2020) 2020
DOI: 10.1109/fg47880.2020.00141
How are you feeling? Multimodal Emotion Learning for Socially-Assistive Robot Navigation

Cited by 10 publications (6 citation statements)
References 48 publications
“…An adjacent topic is the integration of emotional cues into Human Robot Interaction [322] and potentially robot navigation. Narayanan et al [323] inferred overall mood from gaits, which were used to adapt the robot's path, and Bera et al [324] used trajectories and faces, as did Gines et al [325]. Yelwande et al [326] compared different path planning strategies that utilise information about the human's emotional state.…”
Section: Discussion
Confidence: 99%
“…Facial expression recognition has been well studied in human-robot interaction (HRI) (e.g., Cid et al, 2014; Meghdari et al, 2016; Simul et al, 2016; Bera et al, 2019). As deep learning methods have become popular, facial expression recognition nowadays mostly consists of preprocessing the facial images and directly feeding them into deep networks to predict an output (Li and Deng, 2020).…”
Section: Related Work
Confidence: 99%
“…Similarly, Bera et al [13] use different data inputs captured from different perspectives, from the robot's camera and a surveillance camera. They propose a system for socially-aware navigation to analyze the emotional behavior from people's faces and trajectories.…”
Section: Related Work
Confidence: 99%