This article presents recent developments in actuated musical instruments created by the authors, who also describe an ecosystemic model of actuated performance activities that blurs traditional boundaries between the physical and virtual elements of musical interfaces. Actuated musical instruments are physical instruments endowed with virtual qualities controlled by a computer in real time, yet they remain tangible, and they provide intuitive and engaging new forms of interaction. They differ from traditional (acoustic) and fully automated (robotic) instruments in that they produce sound via vibrating elements that are co-manipulated by humans and electromechanical systems. We examine the possibilities that arise when such instruments are played in different performative environments and music-making scenarios, and we postulate that such designs may give rise to new methods of musical performance. The Haptic Drum, the Feedback Resonance Guitar, the Electromagnetically Prepared Piano, the Overtone Fiddle, and Teleoperation with Robothands are described, along with musical examples and reflections on the emergent properties of the performance ecologies that these instruments enable. We look at some of the conceptual and perceptual issues introduced by actuated musical instruments, and finally we propose some directions in which such research may be headed.
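As a concrete illustration of the shared architecture, the following Python sketch shows one block of such a human-machine co-manipulation loop: a sensor signal picked up from the vibrating element is reshaped in software and returned to an electromagnetic actuator acting on that same element. The function name and parameters (`process_block`, `gain`, `max_drive`) are illustrative assumptions, not taken from the article.

```python
import numpy as np

# Hypothetical sketch (not the authors' implementation): one block of the
# real-time loop shared by actuated instruments such as the Feedback
# Resonance Guitar. A sensor picks up the vibrating element, the computer
# shapes that signal, and an electromagnetic actuator feeds energy back
# into the same element the human player is also manipulating.

def process_block(sensor_block: np.ndarray, gain: float, max_drive: float) -> np.ndarray:
    """Map one block of sensor samples to actuator drive samples."""
    drive = gain * sensor_block                   # software-set loop gain: the "virtual quality"
    return np.clip(drive, -max_drive, max_drive)  # protect the actuator and vibrating element

# Toy usage: a decaying string partial is re-energised by the loop.
t = np.linspace(0, 0.01, 441)                     # one 10 ms block at ~44.1 kHz
sensor = np.exp(-200 * t) * np.sin(2 * np.pi * 440 * t)
actuator_out = process_block(sensor, gain=1.5, max_drive=0.8)
```

With a loop gain above unity the actuator sustains or amplifies the vibration the player excites; the clip stands in for the physical and safety limits of the hardware.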
This article presents a theoretical framework for the design of expressive musical instruments: the Musical Interface Technology Design Space (MITDS). Imagining, designing, and building new musical instruments; performing, composing, and improvising with them; and analysing the whole process in order to better understand the interface, our physical and cognitive associations with it, and the relationship between performer, instrument, and audience can only be seen as an ongoing, iterative work in progress. It is long-term evolutionary research: each generation of a new musical instrument requires inventiveness and years of dedication to the practice and mastery of its performance system (comprising the interface, the synthesis, and the mappings between them), and many revisions of the system may be required before a musical interface technology enables truly expressive performance. The MITDS provides a conceptual framework for describing, analysing, designing, and extending the interfaces, mappings, synthesis algorithms, and performance techniques of interactive musical instruments. It gives designers a theoretical base to draw upon when creating technologically advanced performance systems, and can be seen as a set of guidelines for analysis and a taxonomy of design patterns for interactivity in musical instruments. The MITDS focuses mainly on human-centred design approaches to real-time control of the multidimensional parameter spaces in musical composition and performance, where the primary objective is to close the gap between human gestures and complex synthesis methods.
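The kind of explicit, inspectable gesture-to-synthesis mapping layer that the MITDS reasons about might be sketched as follows; the class name, weights, and parameter ranges are illustrative assumptions and are not part of the framework itself.

```python
import numpy as np

# Hypothetical sketch of a many-to-many mapping layer between a
# low-dimensional gesture vector and a higher-dimensional synthesis
# parameter space. All names and numbers here are illustrative.

class GestureMap:
    def __init__(self, weights: np.ndarray, param_ranges: np.ndarray):
        self.weights = weights            # shape: (n_params, n_gestures)
        self.param_ranges = param_ranges  # shape: (n_params, 2) -> (min, max)

    def __call__(self, gesture: np.ndarray) -> np.ndarray:
        """Map normalised gestures in [0, 1] to scaled synthesis parameters."""
        mixed = np.clip(self.weights @ gesture, 0.0, 1.0)
        lo, hi = self.param_ranges[:, 0], self.param_ranges[:, 1]
        return lo + mixed * (hi - lo)

# Two gesture dimensions (e.g. pressure, tilt) drive three parameters.
mapping = GestureMap(
    weights=np.array([[1.0, 0.0], [0.3, 0.7], [0.0, 1.0]]),
    param_ranges=np.array([[0.0, 1.0], [100.0, 4000.0], [0.0, 0.9]]),
)
params = mapping(np.array([0.5, 0.2]))  # -> amplitude, filter cutoff (Hz), feedback
```

Making the mapping a first-class, swappable object is one way to support the iterative revision cycle the framework describes: interface and synthesis can stay fixed while the map between them evolves.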
Self-resonating vibrotactile instruments (SRIs) are hybrid feedback instruments characterised by an electro-mechanical feedback loop that is both the means of sound production and the expressive interface. Through the lens of contemporary SRIs, we reflect on how they are characterised, designed, and played, and, drawing on reports from designers and players of this species of instrument-performance system, we explore the experience of playing them. With a view to supporting future research and practice in the field, we illustrate the value of conceptualising SRIs in cybernetic and systems-theoretic terms, and we suggest that this offers an intuitive yet powerful basis for future performance, analysis, and making. In doing so, we close the loop between the making, playing, and conceptualisation of SRIs, with the aim of nourishing the evolution of theory and of creative and technical practice in this field.
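The loop at the heart of an SRI can be caricatured digitally as a delay (the acoustic path), a loop gain (the amplifier), and a saturating nonlinearity (the physical limits of driver and resonator). The sketch below, with all parameters assumed for illustration, self-oscillates from a noise seed once the gain exceeds unity, mirroring how the same loop both produces the tone and forms the surface the player perturbs.

```python
import numpy as np

# Minimal digital analogue of an SRI's feedback loop, assuming a single
# delay, a loop gain, and a tanh soft limiter. Parameters are illustrative.

def sri_loop(n_samples: int, delay: int, gain: float) -> np.ndarray:
    out = np.zeros(n_samples)
    out[:delay] = 1e-4 * np.random.randn(delay)   # tiny seed, like ambient noise
    for n in range(delay, n_samples):
        out[n] = np.tanh(gain * out[n - delay])   # gain > 1 self-sustains; tanh limits
    return out

tone = sri_loop(n_samples=44100, delay=100, gain=1.2)  # ~441 Hz fundamental at 44.1 kHz
```

In this toy model, "playing" corresponds to perturbing the loop state or its parameters, which is exactly the cybernetic framing the article advocates: performer and instrument form one coupled system.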
Music performance provides an elaborate research test bed of subtle and complex gestural interactions among the members of a performance group. To develop a paradigm that allows performers to interact as naturally and subtly with automated digital systems as they do with other human performers, an interface design must allow performers to play their instruments untethered, using only natural cues and body language to control computer information. This article presents a multimodal system for gesture recognition in untethered interactive flute performance. Using computer vision, audio analysis, and electric-field sensing, a performer's discrete cues …
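The sensing suite described above implies some fusion stage that combines evidence across modalities before triggering a response. The weighted vote below is a hypothetical sketch (`fuse_cue`, its weights, and its threshold are assumptions), not the system's actual recogniser.

```python
# Hypothetical sketch of multimodal cue fusion: three independent detectors
# (vision, audio, electric field) each report a confidence that the flutist
# gave a discrete cue, and a weighted vote decides whether to trigger the
# computer's response. Weights and threshold are illustrative assumptions.

def fuse_cue(vision: float, audio: float, efield: float,
             weights=(0.4, 0.3, 0.3), threshold=0.5) -> bool:
    """Return True if the weighted evidence for a cue crosses the threshold."""
    score = sum(w * c for w, c in zip(weights, (vision, audio, efield)))
    return score >= threshold

# A strong visual cue with weak corroboration from the other modalities:
triggered = fuse_cue(vision=0.9, audio=0.3, efield=0.4)  # -> True (score 0.57)
```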