Gestural expressiveness plays a fundamental role in how people interact with other people, environments, animals, and objects. Several emerging application domains could therefore exploit the interpretation of movements to support their design processes. To this end, new ways of expressing people's perceptions, such as music, could aid this interpretation. In this paper, we investigate users' perception of sounds, highlighting how sounds can help users adapt to a specific environment. We present a novel algorithm for mapping human movements into MIDI music. The algorithm has been implemented in a system that integrates a module for real-time movement tracking with a sample-based synthesizer that uses different types of filters to modulate frequencies. The system has been evaluated through a user study in which several users participated in a room experience, yielding significant results about their perception of the environment in which they were immersed.
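
To give a concrete sense of the kind of movement-to-MIDI mapping described above, the following Python sketch maps a normalized 3-D position to a MIDI note, velocity, and filter-cutoff control change. The axis assignments, the pentatonic scale, and the use of CC 74 for cutoff are illustrative assumptions, not the paper's actual algorithm.

```python
# A minimal, hypothetical sketch of a movement-to-MIDI mapping.
# Axis assignments, the pentatonic scale, and the use of CC 74 for
# filter cutoff are illustrative assumptions, not the paper's algorithm.

PENTATONIC = [0, 2, 4, 7, 9]  # C-major pentatonic intervals

def movement_to_midi(x: float, y: float, z: float) -> dict:
    """Map a normalized position (each coordinate in [0, 1]) to MIDI values.

    - x selects pitch across four octaves of a pentatonic scale
    - y controls note velocity (loudness)
    - z drives a filter-cutoff control change (CC 74), the kind of
      parameter a sample-based synthesizer could use to modulate
      frequencies in real time
    """
    degree = int(x * (len(PENTATONIC) * 4 - 1))   # scale step across 4 octaves
    octave, step = divmod(degree, len(PENTATONIC))
    note = 48 + 12 * octave + PENTATONIC[step]    # start at C3 (MIDI note 48)
    velocity = max(1, min(127, int(y * 127)))     # MIDI velocity 1..127
    cutoff = max(0, min(127, int(z * 127)))       # CC 74 value 0..127
    return {"note": note, "velocity": velocity, "cc74_cutoff": cutoff}

# Example: a hand near the top-right of the tracked volume
print(movement_to_midi(x=0.8, y=0.9, z=0.3))
# -> {'note': 84, 'velocity': 114, 'cc74_cutoff': 38}
```

Quantizing pitches to a pentatonic scale is one common way to keep arbitrary movements consonant; the actual system may use a different quantization or continuous pitch control.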