A sonification is a rendering of data as audio, used where visual representations of data are impossible, impractical, or unwanted. Designing sonifications often requires expertise in multiple areas as well as an understanding of how end users will use the system. This makes sonification an ideal candidate for end-user development, in which the user plays a role in creating the design. We present a model for sonification that uses user-specified examples and data to generate cross-domain mappings from data to sound. As a novel contribution, we use soundscapes (acoustic scenes) as these user-selected examples to define a structure for the sonification. We demonstrate a proof of concept of our model using sound examples and discuss how we plan to build on this work in the future.
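To make the idea of example-driven cross-domain mapping concrete, the sketch below shows one simple way such a mapping could work: the user supplies a few example pairings of data values with soundscape-layer mixes, and new data values are rendered by interpolating between the nearest examples. The layer names, example values, and linear interpolation are illustrative assumptions, not the model described in the abstract.

```python
# Minimal sketch (assumptions: layer names and linear interpolation between
# user-provided example points; not the authors' actual model).
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class Example:
    data_value: float               # normalized data value chosen by the user
    layer_gains: Dict[str, float]   # soundscape layer -> gain at that value

def interpolate_mapping(examples: List[Example], value: float) -> Dict[str, float]:
    """Derive soundscape layer gains for `value` by interpolating
    between the two nearest user-specified examples."""
    pts = sorted(examples, key=lambda e: e.data_value)
    lo = max((e for e in pts if e.data_value <= value), default=pts[0],
             key=lambda e: e.data_value)
    hi = min((e for e in pts if e.data_value >= value), default=pts[-1],
             key=lambda e: e.data_value)
    if hi.data_value == lo.data_value:
        return dict(lo.layer_gains)
    t = (value - lo.data_value) / (hi.data_value - lo.data_value)
    layers = set(lo.layer_gains) | set(hi.layer_gains)
    return {name: (1 - t) * lo.layer_gains.get(name, 0.0)
                  + t * hi.layer_gains.get(name, 0.0)
            for name in layers}

# Usage: mostly rain at low data values; birds and wind fade in as values rise.
examples = [
    Example(0.0, {"rain": 0.8}),
    Example(1.0, {"rain": 0.2, "birds": 0.7, "wind": 0.5}),
]
print(interpolate_mapping(examples, 0.25))
```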
In this article, we describe methods and consequences of giving audience members interactive control over the real-time sonification of performer movement data in electronic music performance. We first briefly describe how to technically implement a musical performance in which each audience member can interactively construct and change their own individual sonification of the performers' movements, heard through headphones on a personal WiFi-enabled device, while maintaining delay-free synchronization between performer movements and sound. We then describe two studies conducted in the context of live musical performances with this technology. These studies allowed us to examine how giving audience members the ability to interactively sonify performer actions affected their experiences, including their perceptions of their own role and their engagement with the performance. They also allowed us to explore how audience members with different levels of expertise in sonification and sound, and different motivations for interacting, could be supported and influenced by different sonification interfaces. This work contributes to a better understanding of how providing interactive control over sonification may alter listeners' experiences, of how to support everyday people in designing and using bespoke sonifications, and of new possibilities for musical performance and participation.
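As an illustration of the kind of architecture this implies, the sketch below shows how a personal device might receive performer movement data (rather than audio) over a WebSocket and map it to local synthesis parameters under a listener-chosen mapping, so that synchronization does not depend on streaming audio. The endpoint URI, message fields, and parameter mapping are hypothetical, not the authors' implementation.

```python
# Minimal sketch (assumptions: a hypothetical WebSocket endpoint and JSON
# message fields; only movement *data* is streamed, and audio is rendered
# locally so each listener's mapping can differ without audio latency).
import asyncio
import json
import websockets  # pip install websockets

# Listener-chosen mapping: which movement feature drives which synth parameter.
my_mapping = {"arm_height": "pitch", "speed": "loudness"}

def apply_mapping(features: dict, mapping: dict) -> dict:
    """Turn movement features (normalized 0..1) into local synthesis parameters."""
    params = {}
    if mapping.get("arm_height") == "pitch":
        params["frequency_hz"] = 220 + 660 * features.get("arm_height", 0.0)
    if mapping.get("speed") == "loudness":
        params["gain"] = features.get("speed", 0.0)
    return params

async def listen(uri: str = "ws://performance.local:8080/movement"):
    async with websockets.connect(uri) as ws:
        async for message in ws:
            features = json.loads(message)  # e.g. {"arm_height": 0.6, "speed": 0.3}
            params = apply_mapping(features, my_mapping)
            # Hand `params` to a synthesizer voice running on the listener's
            # own device; here we just print them.
            print(params)

# asyncio.run(listen())  # requires a running movement-data server
```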
In this paper, we explore the potential for everyday Twitter users to design and use soundscape sonifications as an alternative, “calm” modality for staying informed of Twitter activity. We first present the results of a survey assessing how 100 Twitter users currently use and change audio notifications. We then present a study in which 9 frequent Twitter users employed two user interfaces, with varying degrees of automation, to design, customize, and use soundscape sonifications of Twitter data. This work suggests that soundscapes have great potential for creating a calm technology for maintaining awareness of Twitter data, and that soundscapes can be useful in helping people without prior experience in sound design think about sound in sophisticated ways and engage meaningfully in sonification design.
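As a rough illustration of what a soundscape sonification of Twitter data could look like, the sketch below maps activity rates onto gains for ambient soundscape layers, so that busier activity thickens the scene without demanding attention. The layer names, thresholds, and activity measures are illustrative assumptions, not the interfaces studied in the paper.

```python
# Minimal sketch (assumptions: hypothetical activity rates supplied by some
# polling code; the mapping is illustrative, not the paper's design).
def _clamp(x: float) -> float:
    return max(0.0, min(1.0, x))

def twitter_soundscape_gains(mentions_per_min: float,
                             timeline_tweets_per_min: float) -> dict:
    """Map Twitter activity rates onto gains for ambient soundscape layers."""
    return {
        "rain":  _clamp(timeline_tweets_per_min / 30.0),  # overall timeline volume
        "birds": _clamp(mentions_per_min / 5.0),          # direct mentions
    }

print(twitter_soundscape_gains(mentions_per_min=2, timeline_tweets_per_min=12))
```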