In this paper, we present a new approach to realizing whole-body tactile interaction with a self-organizing, multi-modal artificial skin on a humanoid robot. To this end, we equipped the entire upper body of the humanoid HRP-2 with patches of CellulARSkin, a modular artificial skin. To automatically handle a potentially large number of tactile sensor cells and motor units, the robot uses open-loop exploration motions, together with accelerometers distributed in the skin cells, to acquire self-centered sensory-motor knowledge. This body self-knowledge is then used to translate multi-modal tactile stimuli into reactive body motions. Tactile events provide feedback on contact changes over the whole body surface. We demonstrate the feasibility of our approach by having HRP-2 grasp large, unknown objects using tactile feedback alone: kinesthetically taught grasping trajectories are reactively adapted to the size and stiffness of different test objects. Our paper contributes the first realization of a self-organizing tactile sensor-behavior mapping on a full-sized humanoid robot, enabling a position-controlled robot to handle objects compliantly.
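The reactive adaptation step can be pictured as a simple replay-and-hold loop: the taught trajectory is advanced only while the skin reports no contact, so the final grasp pose settles on the object's actual surface. The Python sketch below is illustrative only, not the paper's implementation; the skin and robot interfaces (`read_cell_pressures`, `send_joint_targets`) and the threshold value are hypothetical placeholders.

```python
# Illustrative sketch (not the paper's code): replay a kinesthetically
# taught joint trajectory and hold the current set-point whenever the
# artificial skin reports contact, so a position-controlled robot does
# not press harder into the object. Skin/robot interfaces are stubs.

from typing import List, Sequence

CONTACT_THRESHOLD = 0.5  # assumed normalized cell pressure meaning "contact"

class SkinPatchStub:
    """Stand-in for an artificial-skin patch returning per-cell pressures."""
    def read_cell_pressures(self) -> List[float]:
        return [0.0]  # replace with the real multi-modal sensor readout

class RobotStub:
    """Stand-in for a position-controlled joint interface."""
    def send_joint_targets(self, q: Sequence[float]) -> None:
        pass  # forward set-points to the joint position controllers

def reactive_grasp(trajectory: List[List[float]],
                   skin: SkinPatchStub,
                   robot: RobotStub) -> None:
    """Advance through the taught trajectory; on contact, stop advancing
    so object size and stiffness determine the final grasp pose."""
    i = 0
    while i < len(trajectory):
        if max(skin.read_cell_pressures()) > CONTACT_THRESHOLD:
            # Contact detected: re-send the previous set-point instead of
            # pressing on (a completion check would end the loop here).
            robot.send_joint_targets(trajectory[max(i - 1, 0)])
        else:
            robot.send_joint_targets(trajectory[i])
            i += 1
```

With the stub sensor the loop simply replays the trajectory; on a real system the per-limb pause is what lets the same taught motion close compliantly around objects of different size and stiffness.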