Over the last five years, KIMA, an art and research project on sound and vision, has investigated the visual properties of sound. Previous iterations of KIMA focused on digital representations of cymatics, physical sound patterns, as media for performance. The most recent development incorporated neural networks and machine learning strategies to explore visual expressions of sound in participatory music creation. The project, displayed on a 360-degree canvas at the London Roundhouse, invited the audience to explore their own voices as intelligent, real-time visual representations. Machine learning algorithms played a key role in the meaningful interpretation of sound as visual form. The resulting immersive performance turned the audience into co-creators of the piece.
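The abstract does not specify how voice is turned into visual form, but a minimal sketch of the general idea, extracting audio features from a voice buffer and mapping them to visual parameters, might look like the following. The feature choices (RMS loudness, spectral centroid) and the mapping to radius and hue are illustrative assumptions, not KIMA's actual pipeline.

```python
import numpy as np

def voice_to_visual(samples: np.ndarray, sample_rate: int = 44100) -> dict:
    """Map a mono voice buffer to simple visual parameters.

    Illustrative only: the feature set and mapping are assumptions chosen
    for clarity (RMS -> size, spectral centroid -> hue), not KIMA's method.
    """
    # Loudness: root-mean-square amplitude of the buffer.
    rms = float(np.sqrt(np.mean(samples ** 2)))

    # Brightness: spectral centroid of the magnitude spectrum.
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    centroid = float(np.sum(freqs * spectrum) / (np.sum(spectrum) + 1e-12))

    return {
        "radius": 50.0 + 400.0 * rms,          # louder voice -> larger form
        "hue": min(centroid / 4000.0, 1.0),    # brighter timbre -> warmer hue
    }

if __name__ == "__main__":
    t = np.linspace(0.0, 1.0, 44100, endpoint=False)
    demo = 0.3 * np.sin(2 * np.pi * 220.0 * t)  # stand-in for a captured voice
    print(voice_to_visual(demo))
```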
KIMA is a holographic surround-sound installation that visualizes telepresence as both a phonetic and a synaesthetic phenomenon. The performance piece is based on the physical conditions of cymatics, the study of physically visible sound-wave patterns. Two environments, a quad surround and a holographic interface, form the framework of a telematic experience that illustrates communication as waveforms while focusing on the relationship between sound and matter.
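Cymatic figures such as Chladni patterns are often approximated by superposed standing waves. The sketch below renders the nodal lines of the common simplified square-plate model z(x, y) = sin(nπx)sin(mπy) + sin(mπx)sin(nπy); the mode numbers and plate model are illustrative assumptions, not the installation's rendering method.

```python
import numpy as np

def chladni_pattern(n: int, m: int, size: int = 400) -> np.ndarray:
    """Approximate a Chladni-like figure on a unit square plate.

    Uses the common simplified standing-wave superposition
    z = sin(n*pi*x)*sin(m*pi*y) + sin(m*pi*x)*sin(n*pi*y);
    nodal lines (z ~ 0) are where scattered particles would collect.
    """
    x = np.linspace(0.0, 1.0, size)
    y = np.linspace(0.0, 1.0, size)
    X, Y = np.meshgrid(x, y)
    z = (np.sin(n * np.pi * X) * np.sin(m * np.pi * Y)
         + np.sin(m * np.pi * X) * np.sin(n * np.pi * Y))
    # Mark grid points close to the nodal lines.
    return np.abs(z) < 0.02

if __name__ == "__main__":
    pattern = chladni_pattern(n=3, m=5)
    print(f"{pattern.sum()} of {pattern.size} grid points lie on nodal lines")
```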
KIMA: The Voice proposes a sonic and visual composition as an act of co-creation, an 'open work' to which everyone can contribute. We invite participants to experience tonal harmonies with one another. With KIMA: The Voice, we aim to embed ways of measuring audience participation during creative engagement.
KIMA: Noise is a participatory art piece inviting audiences to explore the impact of urban noise interactively. Using specific urban sound sources, the audience experiences noise as a spatial soundscape, responding to it, physically engaging with it and interacting with it. KIMA: Noise raises awareness of noise pollution. The paper reviews preeminent research in the field and draws conclusions about how sound affects us as individuals. The art project KIMA: Noise is introduced both technically and conceptually. Keywords: visual sound, participatory art, visual sound installation, noise pollution.
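The abstract does not describe how the urban sources are spatialized, but one standard technique is sketched below: equal-power stereo panning of a mono urban recording. The pan law and the source placement are illustrative assumptions rather than the installation's actual surround setup.

```python
import numpy as np

def equal_power_pan(mono: np.ndarray, pan: float) -> np.ndarray:
    """Place a mono urban sound source in a stereo field.

    pan runs from -1.0 (hard left) to +1.0 (hard right); the equal-power
    law keeps perceived loudness roughly constant as the source moves.
    Generic spatialization sketch, not KIMA: Noise's actual setup.
    """
    angle = (pan + 1.0) * np.pi / 4.0         # map [-1, 1] -> [0, pi/2]
    left = np.cos(angle) * mono
    right = np.sin(angle) * mono
    return np.stack([left, right], axis=1)    # shape: (samples, 2)

if __name__ == "__main__":
    t = np.linspace(0.0, 1.0, 44100, endpoint=False)
    traffic_like = 0.2 * np.random.default_rng(0).standard_normal(t.size)
    stereo = equal_power_pan(traffic_like, pan=0.5)  # slightly to the right
    print(stereo.shape)
```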
Transmission is both a telepresence performance and a research project. As a real-time visualization tool, Transmission creates alternate representations of neural activity through sound and vision, investigating the effect of interaction on human consciousness. As a sonification project, it creates an immersive experience for two users: a soundscape created by the human mind and the influence of kinetic interaction. An electroencephalographic (EEG) headset interprets a user's neural activity. An Open Sound Control (OSC) script then translates this data into a real-time particle stream and sound environment at one end. A second user in a remote location modifies this stream in real time through body movement. Together they become a telematic musical interface, communicating through visual and sonic representations of their interactions.
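The abstract describes EEG data being forwarded over OSC to drive the particle stream and sound environment; a minimal sketch of that forwarding step using the python-osc library is shown below. The OSC addresses, port, and the assumption that the headset exposes per-band power values are illustrative, not the project's actual message schema.

```python
from pythonosc.udp_client import SimpleUDPClient

# Address and port of the machine rendering the particle stream and soundscape
# (illustrative values; Transmission's actual routing is not specified).
client = SimpleUDPClient("127.0.0.1", 9000)

def send_eeg_frame(band_powers: dict[str, float]) -> None:
    """Forward one frame of EEG band-power readings as OSC messages.

    Each band (e.g. alpha, beta) is sent on its own address so the
    visual and sonic layers can subscribe to the bands they need.
    """
    for band, power in band_powers.items():
        client.send_message(f"/eeg/{band}", float(power))

if __name__ == "__main__":
    # A single hypothetical frame of normalized band powers.
    send_eeg_frame({"alpha": 0.42, "beta": 0.17, "theta": 0.31})
```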