The context of this work is the development, adaptation, and integration of augmented reality tools to enhance the emotional impact of cultural performances. Part of the work was dedicated to augmenting the stage in a live performance, with dance as the application case. In this paper, we present a milestone of this work: an augmented dance show that brings together several tools and technologies developed over the project's lifetime. It is the result of combining an artistic process with scientific research and development. The augmented show brings research questions from the fields of Human-Machine Interaction (HMI) and Augmented Reality (AR) onto the stage. Virtual elements, both visual and auditory, are added to the stage, and the dancer interacts with them in real time using different interaction techniques. The originality of this work is threefold. First, we propose a set of movement-based interaction techniques that can be used independently, on stage or in other contexts. Some of these techniques map movement directly to output, while others operate through a higher level of abstraction: we perform movement-based emotion recognition on the dancer and use the recognized emotions to generate emotional music pieces and emotional poses for a humanoid robot. Second, these interaction techniques rely on several interconnected systems that can be recombined; we hence propose an integrated, interactive system for augmenting a live performance, a context in which system failure is not tolerated. The final system can be adapted to the artist's preferences. Finally, these systems were validated through a field experiment, the show itself, after which we gathered and analyzed feedback from both the audience and the choreographer.
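To make the emotion-driven pipeline concrete, the sketch below shows one way the recognized emotion could be routed to the music and robot subsystems. This is a minimal illustration, not the authors' implementation: the names `Emotion`, `StageOutputs`, and `route_emotion` are hypothetical, and the real subsystems are stood in for by logging callbacks.

```python
from dataclasses import dataclass
from enum import Enum, auto
from typing import Callable


class Emotion(Enum):
    """Hypothetical discrete labels a movement-based recognizer might emit."""
    JOY = auto()
    SADNESS = auto()
    ANGER = auto()
    SERENITY = auto()


@dataclass
class StageOutputs:
    """Callbacks into the (hypothetical) music and robot subsystems."""
    play_music: Callable[[Emotion], None]
    strike_pose: Callable[[Emotion], None]


def route_emotion(emotion: Emotion, outputs: StageOutputs) -> None:
    """Forward one recognized emotion to every augmentation channel.

    Centralizing the routing keeps the pipeline easy to recombine:
    channels can be swapped or disabled to match the choreographer's
    preferences without touching the recognizer itself.
    """
    outputs.play_music(emotion)
    outputs.strike_pose(emotion)


if __name__ == "__main__":
    # Stand-in subsystems that only log what they would do on stage.
    outputs = StageOutputs(
        play_music=lambda e: print(f"music generator: piece for {e.name}"),
        strike_pose=lambda e: print(f"robot: pose expressing {e.name}"),
    )
    # A recognized emotion from the dancer's movement would arrive here.
    route_emotion(Emotion.JOY, outputs)
```

Decoupling the recognizer from the output channels in this way reflects the paper's claim that the interconnected systems can be reassembled for other performances.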