“…Commissioned and supported by the British Science Association (Liggett et al., 2017), the Manchester Science Festival, and Arts Council England, the work combined interactive music and visuals, motion tracking, haptic feedback, and online audience participation. Its primary aim was to develop a system in which the performer’s choreography can be influenced directly by generated data, mirroring the way motion-tracking technologies are used to control sound, visuals, lights, and any other media that can be affected by digital data (Siegel, 2009). The authors explored this concept in response to the “general orthodoxy” on which interactive performance is based, which in the case of music-based interaction follows a model of “gesture → sensor → sound = musical expression” (Salter et al., 2008, p. 249), whereby “dancers are able to have a direct effect on the music during a performance” (Siegel, 2009, p. 193). Although the data generated by the choreography can affect digital media in a multitude of ways, dancers can only draw on the resulting audiovisual material to shape the choreography through their own hearing and vision.…”