In this work, we propose a gesture-based language that allows humans to interact with robots using their bodies in a natural way. We create a new neural-network-based gesture detection model, together with a new dataset of humans performing a collection of body gestures, which we use to train this architecture. Furthermore, we compare body gesture communication with other communication channels to demonstrate the value of adding this capability to robots. The presented approach is validated in diverse simulations and in real-life experiments with untrained volunteers. It achieves promising results and shows that the proposed framework is valuable for social robotics applications, such as human-robot collaboration and human-robot interaction.