2021
DOI: 10.3390/app11104639
Expressing Robot Personality through Talking Body Language

Abstract: Social robots must master the nuances of human communication as a means to convey an effective message and generate trust. It is well known that non-verbal cues are very important in human interactions, and therefore a social robot should produce body language coherent with its discourse. In this work, we report on a system that endows a humanoid robot with the ability to adapt its body language according to the sentiment of its speech. A combination of talking beat gestures with emotional cues such as eye li…

Cited by 16 publications (10 citation statements)
References 45 publications
“…Based on these gestures’ affective state and meaning in the context of the sentence, a gesture ranker selects one gesture to be generated. Zabala et al. [29] presented a method for adapting beat gestures, lights in the eyes, body posture, and vocal intonation and volume to the sentiment of a humanoid robot’s speech. The system first retrieves the valence score for each word in the script using the VADER sentiment analyser.…”
Section: Related Work
confidence: 99%
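The per-word valence lookup described in that statement can be sketched in a few lines. This is an illustrative stand-in, not the authors' implementation: the toy lexicon below substitutes for VADER's full valence lexicon (the real `vaderSentiment` package wraps the same idea in its `SentimentIntensityAnalyzer`), and the normalisation by VADER's nominal maximum of 4 is an assumption for the sketch.

```python
# Illustrative sketch of per-word valence scoring as described above.
# TOY_LEXICON stands in for VADER's full lexicon; scores use VADER's
# nominal range of [-4, 4].
TOY_LEXICON = {
    "great": 3.1, "happy": 2.7, "trust": 1.7,
    "bad": -2.5, "sad": -2.1, "robot": 0.0,
}

def word_valences(script: str) -> list[tuple[str, float]]:
    """Return (word, valence) pairs; unknown words score 0.0 (neutral)."""
    return [(w, TOY_LEXICON.get(w.lower(), 0.0)) for w in script.split()]

def sentence_valence(script: str) -> float:
    """Mean word valence, normalised to [-1, 1] by VADER's max of 4."""
    scores = [v for _, v in word_valences(script)]
    return sum(scores) / (len(scores) * 4.0) if scores else 0.0
```

A downstream module could then pick gestures, posture, or intonation from the resulting sentence-level valence.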
“…Another interface that is present in multiple robots is coloured LEDs, usually combined with other modalities. This can be seen in the approaches of Gomez et al. [39], Hong et al. [40], or Zabala et al. [29].…”
Section: Related Work
confidence: 99%
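How a sentiment score might drive coloured LEDs can be sketched as follows. This is a hypothetical mapping, assuming a valence in [-1, 1] such as a sentiment analyser produces; it is not the colour scheme of any of the cited systems.

```python
# Hypothetical valence-to-LED-colour mapping: neutral speech keeps the
# LEDs white, positive valence fades toward green, negative toward blue.
def valence_to_rgb(valence: float) -> tuple[int, int, int]:
    """Map a valence in [-1, 1] to an (R, G, B) triple in 0..255."""
    v = max(-1.0, min(1.0, valence))  # clamp to the expected range
    if v >= 0:
        # white (255,255,255) at v=0 fading to green (0,255,0) at v=1
        return (int(255 * (1 - v)), 255, int(255 * (1 - v)))
    # white at v=0 fading to blue (0,0,255) at v=-1
    return (int(255 * (1 + v)), int(255 * (1 + v)), 255)
```

In practice such a triple would be sent to the robot's LED interface together with the matching gesture and vocal parameters.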
“…Many human activities involve “effort”—albeit defined differently in different contexts—and therefore the idea may apply to a range of human behaviour other than body motion. Zabala et al. (2021) proposed an affect-based system that generates non-verbal behaviour consistent with the robot’s speech. Their system does not have components to encode inter- and intra-individual differences of personalities by traits but works by generating non-verbal behaviour from the emotions expressed in the robot’s speech.…”
Section: Overview of the Current Generative Personality Models
confidence: 99%
“…Although there is extensive research on the detection of user gestures [5], robot-generated gestures are only rarely studied in human–robot interaction (HRI) and are frequently combined with speech [6]. In particular, gesture-based interactions have mainly been studied for story-telling robots [7,8], social interactions [9,10], and educational activities for children [11]. However, nonverbal cues have not yet been thoroughly investigated for helping autonomous robots guide humans to perform physical tasks.…”
Section: Introduction
confidence: 99%