This paper presents the design, control, and emotion expression capabilities of the robotic head EMYS. A concept of a motion control system based on FACS (Facial Action Coding System) theory is proposed. On the basis of this control system, six basic emotions are designed for the EMYS head. The proposed head shapes are verified in experiments with the participation of children aged 8-12. The results of the experiments and the perception of the proposed design and control system are discussed.
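The abstract does not detail the mapping itself, but a FACS-based expression controller of this kind is typically organised in two layers: emotion prototypes expressed as action-unit (AU) activations, and a mapping from AUs to the head's actuators. The sketch below is a minimal, hypothetical Python illustration of that structure: the emotion-to-AU prototypes follow commonly cited Ekman-style AU combinations, while the actuator names, gains, and blending scheme are assumptions for illustration, not the EMYS implementation.

```python
# Illustrative sketch (not the EMYS implementation): mapping FACS action
# units (AUs) to hypothetical head actuator targets. Emotion prototypes
# follow commonly cited Ekman-style AU combinations; actuator names,
# gains, and ranges are assumptions for illustration.

from typing import Dict

# Prototype AU activations (0.0-1.0) for the six basic emotions.
EMOTION_TO_AUS: Dict[str, Dict[int, float]] = {
    "happiness": {6: 0.8, 12: 1.0},
    "sadness":   {1: 0.7, 4: 0.6, 15: 0.8},
    "surprise":  {1: 0.9, 2: 0.9, 5: 0.8, 26: 0.7},
    "fear":      {1: 0.7, 2: 0.6, 4: 0.5, 5: 0.9, 20: 0.6, 26: 0.5},
    "anger":     {4: 1.0, 5: 0.6, 7: 0.7, 23: 0.8},
    "disgust":   {9: 0.9, 15: 0.6, 16: 0.5},
}

# Hypothetical mapping from each AU to the actuators that realise it,
# with per-actuator gains (a real head would tune these empirically).
AU_TO_ACTUATORS: Dict[int, Dict[str, float]] = {
    1:  {"brow_inner": 1.0},
    2:  {"brow_outer": 1.0},
    4:  {"brow_inner": -0.8, "brow_outer": -0.4},
    5:  {"upper_eyelid": 1.0},
    6:  {"lower_eyelid": 0.6},
    7:  {"lower_eyelid": 1.0},
    9:  {"upper_disc_tilt": 0.5},
    12: {"lower_disc_curve": 1.0},
    15: {"lower_disc_curve": -0.8},
    16: {"lower_disc_drop": 0.4},
    20: {"lower_disc_stretch": 0.7},
    23: {"lower_disc_curve": -0.3, "lower_disc_stretch": -0.5},
    26: {"lower_disc_drop": 1.0},
}

def emotion_to_actuator_targets(emotion: str, intensity: float = 1.0) -> Dict[str, float]:
    """Blend the AU prototype of an emotion into actuator targets in [-1, 1]."""
    targets: Dict[str, float] = {}
    for au, activation in EMOTION_TO_AUS[emotion].items():
        for actuator, gain in AU_TO_ACTUATORS.get(au, {}).items():
            targets[actuator] = targets.get(actuator, 0.0) + intensity * activation * gain
    # Clamp to the normalised actuator range.
    return {k: max(-1.0, min(1.0, v)) for k, v in targets.items()}

if __name__ == "__main__":
    print(emotion_to_actuator_targets("sadness", intensity=0.8))
```

One advantage of this two-layer structure is that expression intensity can be scaled continuously and new emotions can be added by specifying AU prototypes alone, without retuning the actuator layer.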
In social robotics, it has been a crucial issue to determine the minimal set of relevant behavioural actions that humans interpret as social competencies. As a potential alternative to mimicking human abilities, it has been proposed to use a non-human animal, the dog, as a natural model for developing simple, non-linguistic emotional expressions for non-humanoid social robots. In the present study, human participants were presented with short video sequences in which a PeopleBot robot and a dog displayed behaviours that corresponded to five emotional states (joy, fear, anger, sadness, and neutral) in a neutral environment. The actions of the robot were developed on the basis of dog expressive behaviours that had been described in previous studies of dog-human interactions. In their answers to open-ended questions, participants spontaneously attributed emotional states to both the robot and the dog. They could also successfully match all dog videos and all robot videos with the correct emotional state. We conclude that our bottom-up approach (starting from a simpler animal signalling system, rather than decomposing complex human signalling systems) can be used as a promising model for developing believable and easily recognisable emotional displays for non-humanoid social robots.
Highlights:
- Humans spontaneously attribute emotions to an ethologically inspired robot
- Dog emotional videos prime the attribution of emotions to robot videos
- Participants were able to match both dog and robot videos to the corresponding emotions
- Experience with dogs does not help identify dog and robot emotions
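The study above scripted the robot's actions from dog expressive behaviour rather than from a published controller. As a rough illustration of how such dog-inspired displays can be parameterised for a non-humanoid platform, the sketch below maps each of the five emotional states to a small set of movement parameters (approach/withdrawal, speed, "head" posture, and a heading oscillation loosely analogous to tail wagging). The ExpressiveProfile fields, numeric values, and MobileBase interface are hypothetical assumptions, not the PeopleBot API or the behaviours used in the study.

```python
# Illustrative sketch (not the study's actual control code): encoding
# dog-inspired expressive parameters for a non-humanoid mobile robot.
# All names and values here are assumptions chosen to mirror the
# behavioural dimensions described in the abstract.

import math
from dataclasses import dataclass

@dataclass
class ExpressiveProfile:
    """Behaviour parameters loosely analogous to dog expressive behaviour."""
    linear_speed: float      # m/s; faster approach for joy, slower for sadness
    approach_direction: int  # +1 approach the human, -1 withdraw, 0 hold position
    camera_tilt_deg: float   # "head" posture: raised (positive) vs lowered (negative)
    wiggle_amplitude: float  # rad/s; heading oscillation as a "tail wag" analogue

# Hypothetical profiles for the five states used in the study.
PROFILES = {
    "joy":     ExpressiveProfile(0.6, +1, +20.0, 0.30),
    "fear":    ExpressiveProfile(0.4, -1, -25.0, 0.00),
    "anger":   ExpressiveProfile(0.5, +1,   0.0, 0.00),
    "sadness": ExpressiveProfile(0.1,  0, -30.0, 0.00),
    "neutral": ExpressiveProfile(0.2,  0,   0.0, 0.00),
}

class MobileBase:
    """Stand-in for a real robot driver (velocity and camera-tilt commands)."""
    def set_velocity(self, linear: float, angular: float) -> None:
        print(f"cmd_vel linear={linear:.2f} angular={angular:.2f}")

    def set_camera_tilt(self, degrees: float) -> None:
        print(f"camera_tilt {degrees:.1f} deg")

def display_emotion(base: MobileBase, state: str, t: float) -> None:
    """Issue one control step of the expressive display at time t (seconds)."""
    p = PROFILES[state]
    angular = p.wiggle_amplitude * math.sin(4.0 * t)  # small heading oscillation
    base.set_velocity(p.approach_direction * p.linear_speed, angular)
    base.set_camera_tilt(p.camera_tilt_deg)

if __name__ == "__main__":
    base = MobileBase()
    for step in range(3):
        display_emotion(base, "joy", t=0.5 * step)
```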
This study investigated whether dogs would engage in social interactions with an unfamiliar robot and utilize the communicative signals it provides, and examined whether the level of sociality shown by the robot affects the dogs' performance. We hypothesized that dogs would react to the communicative signals of a robot more successfully if the robot showed interactive social behaviour in general (towards both humans and dogs) than if it behaved in a machine-like, asocial way. The experiment consisted of an interactive phase followed by a pointing session, both with a human and a robotic experimenter. In the interaction phase, dogs witnessed a 6-min interaction episode between the owner and a human experimenter and another 6-min interaction episode between the owner and the robot. Each interaction episode was followed by the pointing phase, in which the human/robot experimenter indicated the location of hidden food by using pointing gestures (two-way choice test). The results showed that in the interaction phase, the dogs' behaviour towards the robot was affected by the differential exposure. Dogs spent more time staying near the robot experimenter than near the human experimenter, with this difference being even more pronounced when the robot behaved socially. Similarly, dogs spent more time gazing at the head of the robot experimenter when the situation was social. Dogs achieved a significantly lower level of performance (finding the hidden food) with the pointing robot than with the pointing human; however, separate analysis of the robot sessions suggested that gestures of the socially behaving robot were easier for the dogs to comprehend than gestures of the asocially behaving robot. Thus, the level of sociality shown by the robot was not enough to elicit the same set of social behaviours from the dogs as was possible with humans, although sociality had a positive effect on dog-robot interactions.