Abstract - For humanoid robots that are able to assist humans in their daily life, the capability for adequate interaction with human operators is a key feature. Considering that more than 60% of human communication is conducted non-verbally (using facial expressions and gestures), an important research topic is how interfaces for this non-verbal communication can be developed. To achieve this goal, several robotic heads have been designed. However, it remains unclear what exactly such a head should look like and which skills it needs to interact properly with humans. This paper describes an approach that aims at answering some of these design questions. A behavior-based control to realize facial expressions, a basic ability needed for interaction with humans, is presented. Furthermore, the results of a poll in which the generated facial expressions had to be recognized are reported. Additionally, the mechatronic design of the head and the accompanying neck joint is described.

Index Terms - humanoid robot head, facial expressions, mechanical design, behavior-based control