Abstract-If robots are to succeed in novel tasks, they must be able to learn from humans. To improve such human-robot interaction, we present a system that provides dialog structure and engages the human in an exploratory teaching scenario. We specifically target untrained users, who are supported by mixed-initiative interaction using verbal and non-verbal modalities. We present the principles of dialog structuring based on an object learning and manipulation scenario. System development follows an interactive evaluation approach, and we present both an extensible, event-based interaction architecture that realizes mixed-initiative interaction and evaluation results based on a video study of the system. We show that users benefit from the provided dialog structure, resulting in predictable and successful human-robot interaction.
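The abstract does not give implementation details of the event-based interaction architecture; the following Python sketch only illustrates the general idea of an event-driven dialog loop in which either the user or the robot may take the initiative. All names (DialogEvent, DialogManager, the event kinds) are hypothetical and not taken from the described system.

```python
# Minimal sketch of an event-based dialog manager supporting mixed initiative.
# All class and handler names are illustrative assumptions, not the paper's API.
from dataclasses import dataclass
from queue import Queue
from typing import Callable, Dict, List


@dataclass
class DialogEvent:
    source: str   # "user" or "robot" -- either side may take the initiative
    kind: str     # e.g. "utterance", "gesture", "timeout"
    payload: str


class DialogManager:
    def __init__(self) -> None:
        self.queue: "Queue[DialogEvent]" = Queue()
        self.handlers: Dict[str, List[Callable[[DialogEvent], None]]] = {}

    def subscribe(self, kind: str, handler: Callable[[DialogEvent], None]) -> None:
        # Register a handler for a given event kind (extensible by adding kinds).
        self.handlers.setdefault(kind, []).append(handler)

    def post(self, event: DialogEvent) -> None:
        self.queue.put(event)

    def run_once(self) -> None:
        # Dispatch one pending event to all registered handlers.
        event = self.queue.get()
        for handler in self.handlers.get(event.kind, []):
            handler(event)


if __name__ == "__main__":
    dm = DialogManager()
    # Robot takes the initiative when the user falls silent (timeout event).
    dm.subscribe("timeout", lambda e: print("Robot: What is this object called?"))
    # User takes the initiative with an utterance; the robot acknowledges it.
    dm.subscribe("utterance", lambda e: print(f"Robot: I heard '{e.payload}'."))

    dm.post(DialogEvent(source="user", kind="utterance", payload="This is a cup"))
    dm.post(DialogEvent(source="robot", kind="timeout", payload=""))
    dm.run_once()
    dm.run_once()
```

Structuring the dialog around such events is one way to keep the interaction predictable for untrained users while still allowing either party to drive the exchange.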
This article presents results from a multidisciplinary research project on the integration and transfer of language knowledge into robots as an empirical paradigm for the study of language development in both humans and humanoid robots. Within the framework of human linguistic and cognitive development, we focus on how three central types of learning interact and co-develop: individual learning about one's own embodiment and the environment, social learning (learning from others), and learning of linguistic capability. Our primary concern is how these capabilities can scaffold each other's development in a continuous feedback cycle as their interactions yield increasingly sophisticated competencies in the agent's capacity to interact with others and manipulate its world. Experimental results are summarized in relation to milestones in human linguistic and cognitive development and show that the mutual scaffolding of social learning, individual learning, and linguistic capabilities creates the context, conditions, and requisites for learning in each domain. Challenges and insights identified as a result of this research program are discussed with regard to possible and actual contributions to cognitive science and language ontogeny. In conclusion, directions for future work are suggested that continue to develop this approach toward an integrated framework for understanding these mutually scaffolding processes as a basis for language development in humans and robots.
Abstract-The recognition of gaze, for example mutual gaze, plays an important role in social interaction. Previous research shows that even infants are capable of detecting mutual gaze. Such abilities are relevant for robots that learn from interaction, for example to detect when the robot is being addressed. Although various gaze tracking methods have been proposed, few are openly available for robotic platforms such as the iCub. In this paper we describe a gaze tracking system for humanoid robots that is based entirely on freely available libraries and data sets. Our system estimates horizontal and vertical gaze directions from low-resolution VGA images of the robot's embodied vision at 30 frames per second. For this purpose we developed a pupil detection algorithm that combines existing approaches to increase robustness to noise. Our method combines the positions of face and eye features with context features such as eyelid correlates and thus does not rely on a fixed head orientation. An evaluation on the iCub robot shows that our method estimates mutual gaze with 96% accuracy at 8° tolerance and one meter distance to the robot. The results further support that mutual gaze detection yields higher accuracy in an embodied setup compared to other configurations.
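The abstract does not describe the pipeline in code; the sketch below only illustrates the final mutual-gaze decision, i.e., testing whether an estimated gaze direction falls within an angular tolerance (8° in the reported evaluation). The upstream gaze estimator (pupil detection, face and eyelid features) is assumed to be provided and is not reproduced here; all function names are hypothetical.

```python
# Hypothetical sketch: classify mutual gaze by thresholding the angular error
# between the estimated gaze direction and the camera axis. The gaze angles are
# assumed to come from an upstream estimator (pupil detection, face features).
import math


def is_mutual_gaze(yaw_deg: float, pitch_deg: float, tolerance_deg: float = 8.0) -> bool:
    """Return True if the gaze deviates from the camera axis by at most
    `tolerance_deg`. Yaw/pitch are horizontal/vertical gaze angles in degrees,
    with (0, 0) meaning the person looks straight into the camera."""
    # Convert yaw/pitch to a unit gaze vector (camera axis is +z).
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    gaze_z = math.cos(yaw) * math.cos(pitch)  # component toward the camera
    # Angle between the gaze vector and the camera axis (0, 0, 1).
    angle = math.degrees(math.acos(max(-1.0, min(1.0, gaze_z))))
    return angle <= tolerance_deg


if __name__ == "__main__":
    print(is_mutual_gaze(3.0, 2.0))   # True: well within the 8-degree tolerance
    print(is_mutual_gaze(12.0, 0.0))  # False: looking away horizontally
```

The accuracy figure reported above can then be read as the rate at which such a binary decision agrees with ground-truth mutual-gaze labels at one meter distance.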