Mobile manipulation robots are envisioned to provide many useful services, both in domestic environments and in industrial contexts. Examples include domestic service robots that perform large parts of the housework, and versatile industrial assistants that provide automation, transportation, inspection, and monitoring services. The challenge in these applications is that the robots have to function under changing, real-world conditions, deal with considerable amounts of noise and uncertainty, and operate without the supervision of an expert. Current robotic systems typically sidestep these challenges by being custom-tailored to specific applications in well-defined environments, and therefore cannot deal robustly with changing situations.

This thesis presents novel learning techniques that enable mobile manipulation robots, i.e., mobile platforms equipped with one or more robotic manipulators, to autonomously adapt to new or changing situations. The approaches developed in this thesis cover the following four topics: (1) learning the robot's kinematic structure and properties using actuation and visual feedback, (2) learning about articulated objects in the environment in which the robot operates, (3) using tactile feedback to augment visual perception, and (4) learning novel manipulation tasks from human demonstrations.

In the first part of this thesis, we present innovative approaches to learning a robot's own body schema from scratch using visual self-observation. This allows manipulation robots to calibrate themselves automatically and to adapt their body schemata autonomously, for example after hardware failures or during tool use. In the second part, we extend the developed framework to learning about articulated objects, such as doors and drawers, with which service robots often need to interact.
The presented algorithms enable robots to learn accurate kinematic models of articulated objects, which in turn allow them to interact with these objects robustly. In the third part, we provide approaches that allow manipulation robots to make use of tactile perception, an ability that is known to play an important role in human object manipulation skills. The main contributions of this part are approaches to identifying objects and to perceiving aspects of their internal state. With these abilities, a manipulation robot can verify that it has grasped the correct object and, for example, discriminate full from empty bottles. Finally, we present an integrated system that allows human operators to intuitively teach a robot novel manipulation tasks by demonstration.

All techniques developed in this thesis are based on probabilistic learning and inference. They have been implemented and evaluated on real robots as well as in simulation. Extensive experiments have been conducted to analyze and validate the properties of the developed algorithms and to demonstrate a significant increase in the robustness, adaptability, and utility of mobile manipulation robots in everyday life.
Summary