Gesture-Based Programming is a paradigm for programming robots by human demonstration in which the human demonstrator directs the self-adaptation of executable software. The goal is to provide a more natural environment for the user as programmer and to generate more complete and successful programs by focusing on task experts rather than programming experts. We call the paradigm "gesture-based" because we try to enable the system to capture, in real time, the intention behind the demonstrator's fleeting, context-dependent hand motions, contact conditions, finger poses, and even cryptic utterances in order to reconfigure itself. The system is self-adaptive in the sense that knowledge of previously acquired skills (sensorimotor expertise) is retained by the system; this knowledge facilitates the interpretation of gestures during training and then provides feedback control at run time.
INTRODUCTION

The text-based programming paradigm relies exclusively on the expertise of the human programmer and his or her ability to learn from and remember past lessons on system development. Beyond on-line help files, the applications themselves and the programming environments that facilitate their creation acquire no knowledge of themselves and carry over no useful knowledge from project to project, or even within the development of a single project. The burden of retaining the expertise that results from the creation of new applications, or new configurations of existing applications, falls entirely on the human programmer.

The idea behind self-adaptive software is to empower the system itself to participate in its own development. If a software environment can retain a knowledge base of its own capabilities and usefulness, it can assist the human developer during reapplication or reconfiguration and potentially perform autonomous adaptation. The allowable degree of "self-determination" will certainly vary across applications and operating environments. For example, the user interface of a consumer application program may have wide latitude for self-adaptation on an ongoing basis. A manufacturing system, on the other hand, would likely have rigid constraints on the degree of allowable self-adaptation and even on the periods during which self-adaptation is enabled.
GESTURE-BASED PROGRAMMING FOR ROBOTICS

The focus of this paper is not the grand goal of self-adaptive software, but a smaller step toward it that we call human-augmented adaptation. We further limit our efforts to the task of robot programming. In particular, we are interested in the programming paradigm itself and in finding alternative approaches to programming that are more intuitive for human users. Our basic model is human-to-human training, which is very intuitive to users. In this model the trainee takes an active part in the process, so, when applied to the programming of robotic systems, it naturally leads to systems that are more participative in their own development. In this sense, human-augmented adaptation is an important step toward, and a useful compl...