In the field of human-robot interaction, acquiring information about the interactive partner is a key problem for the robot. Without this information, the robot cannot understand what the partner means. Currently, most interactive robots use vision sensors to detect the human face and then react to the interactive partner. However, this works only under strict conditions: once the partner leaves the field of view, the robot no longer knows where to find him. It is therefore important for the robot to exploit other cues indicated by the partner. Some interactive robots use auditory sensors to localize a sound source; such a robot can recognize the location of a sound produced by the partner and turn to face him. However, an audition system requires additional computation and hardware resources, which makes it hard to embed in existing systems. We therefore seek another solution. In this paper, we present a tracking system with pointing gesture recognition for human-robot interaction. By pointing with the fingers, a person can inform the robot that another person wants to interact with it. Finally, the whole system is implemented on our humanoid robot head.
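To illustrate how a recognized pointing gesture can be turned into a location for the robot to attend to, the following is a minimal sketch, assuming the vision system already provides 3-D elbow and fingertip positions in robot coordinates (the function name and the elbow-to-fingertip approximation are our illustrative assumptions, not the paper's specified method). The pointed-at spot is estimated as the intersection of the elbow-to-fingertip ray with the ground plane:

```python
import numpy as np

def pointing_target(elbow, fingertip, ground_z=0.0):
    """Estimate the floor point a person points at (illustrative sketch).

    The pointing direction is approximated by the ray from the elbow
    through the fingertip (both 3-D points in robot coordinates); the
    target is where this ray meets the ground plane z = ground_z.
    Returns None if the ray never reaches the ground plane.
    """
    elbow = np.asarray(elbow, dtype=float)
    fingertip = np.asarray(fingertip, dtype=float)
    d = fingertip - elbow                 # ray direction
    if abs(d[2]) < 1e-9:                  # ray parallel to the ground
        return None
    t = (ground_z - elbow[2]) / d[2]      # ray parameter at the plane
    if t <= 0:                            # plane lies behind the gesture
        return None
    return elbow + t * d

# Elbow at 1.2 m and fingertip at 1.0 m height, arm extended forward:
target = pointing_target([0.0, 0.0, 1.2], [0.3, 0.0, 1.0])
print(target)  # -> [1.8 0.  0. ]
```

The robot can then turn its head toward the returned point, giving it a cue to search for the other person even when that person is outside its current field of view.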