Abstract-In this paper, we present an end-to-end approach that transforms multi-modal tactile signals into compliant control commands to generate different dynamic robot behaviors. This is achieved by fusing multi-modal sensor signals from our artificial skin and joint sensors with different control approaches. One advantage of these compliant behaviors is that they yield safer robots, especially for physical Human-Robot Interaction. A key component of our framework is a parametric robot model built from the artificial skin's multi-modal sensors (proximity, force, and acceleration). These generated models are used to control the robot, improving and even changing its dynamic behavior. We validate our framework on a real wheeled humanoid robot, where it enables a stiff robotic system to become compliant and to react to multi-modal tactile events (pre-contacts and contacts).