Mathematics can help analyze the arts and inspire new artwork. It can also help transform works from one artistic medium to another, while accounting for exceptions, choices, and artists' individual and unique contributions. We propose a method based on diagrammatic thinking and quantum formalism. We exploit decompositions of complex forms into sets of simple shapes, discretization of complex images, and Dirac notation, imagining a world of "prototypes" that can be combined to obtain a fine- or coarse-grained approximation of a given visual image. Visual prototypes are exchanged for auditory ones, and the information characterizing visual prototypes (position, size) is connected with the information characterizing auditory prototypes (onset, duration, loudness, pitch range). The topic is contextualized within a philosophical debate (discreteness and the comparison of apparently unrelated objects), developed through mathematical formalism, and carried through to programming, to spark interdisciplinary thinking and ignite creativity within STEAM.
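By way of illustration only (the abstract does not prescribe an implementation), a minimal Python sketch of such a prototype exchange might pair a visual prototype described by position and size with an auditory prototype described by onset, duration, loudness and pitch range. The field names, ranges and mapping rules below are assumptions chosen for clarity, not the authors' formalism.

```python
# Minimal sketch (not the authors' implementation): exchange a visual "prototype"
# characterised by (x, y, size) for an auditory prototype characterised by
# (onset, duration, loudness, pitch range). All ranges are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class VisualPrototype:
    x: float      # horizontal position, normalised 0..1
    y: float      # vertical position, normalised 0..1
    size: float   # relative size, normalised 0..1

@dataclass
class AuditoryPrototype:
    onset: float        # seconds from the start of the piece
    duration: float     # seconds
    loudness: float     # 0..1
    pitch_range: tuple  # (low MIDI note, high MIDI note)

def visual_to_auditory(v: VisualPrototype, total_time: float = 60.0) -> AuditoryPrototype:
    """Illustrative mapping: x -> onset, size -> duration/loudness, y -> pitch range."""
    centre_pitch = 36 + v.y * 60        # higher shapes -> higher pitches (assumption)
    spread = 2 + v.size * 10            # larger shapes -> wider pitch range
    return AuditoryPrototype(
        onset=v.x * total_time,
        duration=0.2 + v.size * 4.0,
        loudness=min(1.0, 0.2 + v.size),
        pitch_range=(centre_pitch - spread, centre_pitch + spread),
    )

# Example: a small shape near the top-left becomes a short, quiet, early, high event.
print(visual_to_auditory(VisualPrototype(x=0.1, y=0.9, size=0.15)))
```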
In this paper we present the rationale and design for two systems (developed by the Integra Lab research group at Birmingham Conservatoire) implementing a common approach to interactive visualisation of the spatial position of 'sound-objects'. The first system forms part of the AHRC-funded project 'Transforming Transformation: 3D Models for Interactive Sound Design', which entails the development of a new interaction model for audio processing whereby sound can be manipulated through grasp as if it were an invisible 3D object. The second system concerns the spatial manipulation of 'beatboxer' vocal sound using handheld mobile devices through already-learned physical movement. In both cases a means to visualise the spatial position of multiple sound sources within a 3D 'stereo image' is central to the system design, so a common model for this task was developed. This paper describes the ways in which sound and spatial information are implemented to meet the practical demands of these systems, whilst relating this to the wider context of extant and potential future methods for spatial audio visualisation. Digital art. Mobile applications. Music. Performing arts. Technologies.
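As a hedged illustration of the shared visualisation task (not the Integra Lab systems themselves), the sketch below represents each 'sound-object' as a point in a 3D 'stereo image' and derives simple pan and gain values from it for monitoring. The coordinate convention and the attenuation model are assumptions introduced for the example.

```python
# Illustrative sketch only: a 'sound-object' with a position in a 3D 'stereo image'
# (x = left/right, y = front/back, z = height, all -1..1 -- an assumed convention),
# reduced to an equal-power stereo pan and a distance-based gain.

import math
from dataclasses import dataclass

@dataclass
class SoundObject:
    name: str
    x: float  # -1 (left) .. +1 (right)
    y: float  # -1 (behind) .. +1 (in front)
    z: float  # -1 (below) .. +1 (above)

def stereo_pan_and_gain(obj: SoundObject):
    """Equal-power pan from x; simple distance attenuation from the full 3D position."""
    angle = (obj.x + 1.0) * math.pi / 4.0          # map -1..1 to 0..pi/2
    left, right = math.cos(angle), math.sin(angle)
    distance = math.sqrt(obj.x**2 + obj.y**2 + obj.z**2)
    gain = 1.0 / (1.0 + distance)                  # assumed attenuation model
    return left * gain, right * gain

for obj in [SoundObject("vocal", -0.5, 0.2, 0.0), SoundObject("beat", 0.8, 0.9, 0.3)]:
    print(obj.name, stereo_pan_and_gain(obj))
```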
This paper presents a method for mapping embodied gesture, acquired with electromyography and motion sensing, to a corpus of small sound units, organised by derived timbral features using concatenative synthesis. Gestures and sounds can be associated directly using individual units and static poses, or by using a sound-tracing method that leverages our intuitive associations between sound and embodied movement. We propose a method for augmenting corpus density to enable expressive variation on the original gesture-timbre space.
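A minimal sketch of the selection step underlying concatenative synthesis, assuming a corpus indexed by a small timbral feature vector and a toy gesture-to-timbre mapping. The feature names, the gesture mapping and the nearest-neighbour rule are illustrative assumptions, not the authors' trained system.

```python
# Minimal sketch, not the authors' system: pick the corpus sound unit whose
# timbral features are nearest to a target point derived from a gesture.

import numpy as np

# Hypothetical corpus: each row is one sound unit's [centroid, flatness, loudness].
corpus_features = np.array([
    [0.20, 0.10, 0.50],
    [0.55, 0.40, 0.70],
    [0.80, 0.75, 0.30],
    [0.35, 0.20, 0.90],
])

def gesture_to_timbre(emg_amplitude: float, wrist_angle: float) -> np.ndarray:
    """Toy sound-tracing mapping: muscle tension -> brightness, angle -> noisiness."""
    return np.array([emg_amplitude, abs(wrist_angle) / 90.0, 0.5 + emg_amplitude / 2.0])

def select_unit(target: np.ndarray) -> int:
    """Return the index of the corpus unit closest to the target timbre."""
    distances = np.linalg.norm(corpus_features - target, axis=1)
    return int(np.argmin(distances))

target = gesture_to_timbre(emg_amplitude=0.6, wrist_angle=30.0)
print("selected unit:", select_unit(target))
```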
The goal of our research is to provide harpists with the tools to control and transform the sounds of their instrument in a natural and musical way. We consider the development of music with live electronics, with particular reference to the harp repertoire, and include interviews with six harpists who use technology in their professional performance practice. We then present HarpCI, a case study that explores how gestures can be used to control and transform sound and light projection in live performance with the electric harp. HarpCI draws on research from the areas of Human-Computer Interaction (HCI) and Music Interaction Design (MiXD) to extend the creative possibilities available to the performer, and demonstrates our approach to bridging the gap between the performer/composer and the harp on one side, and the technology on the other. We discuss the use of guitar pedals with the electric harp, and the limitations they impose, and then introduce the MyoSpat system as a potential solution to this issue. MyoSpat aims to give musicians control over auditory and visual aspects of the performance through easy-to-learn, intuitive and natural hand gestures. It also aims to enhance the compositional process for instrument and live electronics through a new way of notating music for gesturally controlled interactive systems. The system uses the Myo® armband as a gestural controller for live sound processing, a device that is non-invasive to instrumental technique and to the performer. The combination of these elements allows the performer to experience a tangible connection between gesture and sound production.
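As a loose illustration of gesturally controlled live processing (not the MyoSpat implementation), the sketch below maps a hypothetical recognised hand gesture plus arm orientation to audio-effect parameters. The gesture labels, parameter names and ranges are assumptions made for the example.

```python
# Hedged illustration only: map a recognised hand gesture and a continuous arm
# orientation reading to live-processing parameters. Labels and ranges are hypothetical.

def gesture_to_processing(gesture: str, yaw_degrees: float) -> dict:
    """Combine a discrete gesture class with continuous orientation to set effect parameters."""
    params = {"reverb_mix": 0.0, "delay_feedback": 0.0, "pan": 0.0}
    if gesture == "open_hand":
        params["reverb_mix"] = 0.7          # open hand -> wash of reverb (assumption)
    elif gesture == "fist":
        params["delay_feedback"] = 0.5      # fist -> rhythmic delay (assumption)
    # Arm yaw (-90..90 degrees) steers the sound across the stereo/spatial field.
    params["pan"] = max(-1.0, min(1.0, yaw_degrees / 90.0))
    return params

print(gesture_to_processing("open_hand", yaw_degrees=-45.0))
```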
The EVA London Research Workshop is one of the most distinctive elements of our Conference, and one we have been keen to develop over many years. Postgraduate students at Masters or PhD level, and unaffiliated artists, may often feel excluded from prestigious conferences until their research is complete and they can submit a full paper proposal. Apart from their tutors, supervisors and mentors, however, EVA London provides an almost unique opportunity to submit projects that can truly be described as 'work in progress'. With an audience of international academics and acknowledged experts in the field, the Research Workshop presentations have often led to very positive interest and support, and sometimes to future collaborations, or to returning to EVA London a year later with a completed piece of research and a successful full conference proposal. In previous years the presentations have been very popular with our delegates, and as Chair of the Research Workshops I can think of a number of occasions where an audience question that began "Have you thought of…" has led to very exciting new lines of discovery. Sadly, in 2020 we will miss that particular interaction; however, we have, as always, selected an exciting, ground-breaking and quite eclectic group of RW delegates. We hope that by publishing their papers here, either grouped together around themes or published individually, you will be keen to contact our RW authors to discuss and develop ideas, as if we had all been able to meet up together in July 2020. Digital art. Visual art. Sound and music. Photography. Gender.