Computing technologies have opened up a myriad of possibilities for expanding the sonic capabilities of acoustic musical instruments. Musicians nowadays employ a variety of relatively inexpensive, wireless sensor-based systems to obtain refined control of interactive musical performances in real musical situations such as live concerts. It is essential, though, to clearly understand the capabilities and limitations of such acquisition systems and their potential influence on the high-level control of musical processes. In this study, we evaluate one such system, composed of an inertial sensor (MetaMotionR) and a hexaphonic nylon guitar, for capturing strumming gestures. To characterize this system, we compared it with a high-end commercial motion capture system (Qualisys), typically used in the controlled environments of research laboratories, in two complementary tasks: comparisons of rotational and translational data. For the rotations, we were able to compare our results with those found in the literature, obtaining RMSE below 10° for 88% of the curves. The translations were compared in two ways: by double differentiation of positional data from the mocap and by double integration of IMU acceleration data. For the task of estimating displacements from acceleration data, we developed a compensative-integration method to deal with the oscillatory character of the strumming, whose approximate results depend strongly on the type of gesture and on the segmentation; the normalized covariance coefficients of the displacement magnitudes averaged 0.77. Although not in the ideal range, these results point to a clearly acceptable trade-off between the flexibility, portability and low cost of the proposed system and the restricted usability and high cost of the high-end motion capture standard in interactive music setups.
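As a concrete illustration of the translational comparison, the sketch below performs double integration of acceleration with a generic high-pass drift compensation and computes RMSE and a normalized covariance coefficient against a reference displacement. It is only a minimal Python sketch under assumed values (sampling rate, filter cutoff, function names); it is not the compensative-integration method developed in the study.

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 100.0  # assumed IMU sampling rate in Hz

def highpass(x, fs, fc=0.5, order=2):
    """Zero-phase high-pass filter used to attenuate low-frequency drift."""
    b, a = butter(order, fc / (fs / 2.0), btype="high")
    return filtfilt(b, a, x)

def displacement_from_acceleration(acc, fs=FS):
    """Estimate displacement by double integration of acceleration.

    Drift is attenuated by removing the mean and high-pass filtering after
    each integration step; this is a generic compensation scheme, not the
    exact method described in the paper.
    """
    acc = acc - np.mean(acc)        # remove constant bias
    vel = np.cumsum(acc) / fs       # first integration -> velocity
    vel = highpass(vel, fs)         # suppress integration drift
    disp = np.cumsum(vel) / fs      # second integration -> displacement
    return highpass(disp, fs)

def normalized_covariance(x, y):
    """Covariance of x and y normalized by the product of their norms (1 = identical shape)."""
    x = x - np.mean(x)
    y = y - np.mean(y)
    return float(np.dot(x, y) / (np.linalg.norm(x) * np.linalg.norm(y) + 1e-12))

def rmse(x, y):
    """Root-mean-square error between two equally sampled curves."""
    return float(np.sqrt(np.mean((np.asarray(x) - np.asarray(y)) ** 2)))
```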
The paper presents a real-time tool for the segmentation and analysis of body gestures, part of a larger setup for exploring music and dance in the contexts of electroacoustic composition, live electronics and other interactive performances. The idea of gesture is the foundation of the proposed interactive strategies, and is discussed from different points of view. The current implementation uses the Max/MSP programming language and Kinect sensors. The segmentation of dance gestures is based on the inspection of the zero-crossings of the acceleration curve of each body joint. Concepts from Laban Movement Analysis are used to qualify the extracted gestures. Dance improvisations on Petrushka excerpts are the basis of a case study, where the relations between the music (tempo, pulses, instrumentation, character) and Laban Basic Actions are stressed.
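For reference, the zero-crossing segmentation criterion can be prototyped offline in a few lines. The sketch below is a Python illustration only (the tool itself runs in Max/MSP on Kinect data), and the min_len noise threshold is an assumed parameter, not a value from the paper.

```python
import numpy as np

def zero_crossings(acc):
    """Indices where the acceleration curve of one joint changes sign."""
    signs = np.sign(acc)
    signs[signs == 0] = 1          # treat exact zeros as positive
    return np.where(np.diff(signs) != 0)[0] + 1

def segment_gestures(acc, min_len=5):
    """Split one joint's acceleration curve into candidate gesture segments.

    Each segment spans two consecutive zero-crossings; segments shorter
    than `min_len` samples are discarded as noise (illustrative threshold).
    """
    zc = zero_crossings(np.asarray(acc, dtype=float))
    bounds = np.concatenate(([0], zc, [len(acc)]))
    return [(int(a), int(b)) for a, b in zip(bounds[:-1], bounds[1:])
            if b - a >= min_len]
```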
GuiaRT is an interactive musical setup based on a nylon-string guitar equipped with hexaphonic piezoelectric pickups. It consists of a modular set of real-time tools for the symbolic transcription, variation, and triggering of selected segments during a performance, as well as some audio processing capabilities. Its development relied on an iterative approach, with distinct phases dedicated to feature extraction, transcriptions, and creative use. This article covers the motivations for this augmented instrument and several details of its implementation, including the hardware and strategies for identifying the most typical types of sound produced on a nylon-string guitar, as well as tools for symbolic musical transformations. This acoustic–digital interface was primarily designed for interactive exploration, and it has also been effectively used in performance analyses and as a pedagogical tool.
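To make the idea of a symbolic musical transformation concrete, the sketch below applies transposition and retrograde to a list of note events. The Note fields and function names are hypothetical illustrations; GuiaRT's actual event representation and real-time modules are not described at this level of detail here.

```python
from dataclasses import dataclass, replace
from typing import List

@dataclass
class Note:
    pitch: int       # MIDI note number
    onset: float     # seconds from segment start
    duration: float  # seconds
    string: int      # guitar string index (1-6), one per hexaphonic channel

def transpose(segment: List[Note], semitones: int) -> List[Note]:
    """Shift every pitch in the segment by a fixed number of semitones."""
    return [replace(n, pitch=n.pitch + semitones) for n in segment]

def retrograde(segment: List[Note]) -> List[Note]:
    """Play the segment backwards: mirror onsets around the segment's end."""
    if not segment:
        return []
    end = max(n.onset + n.duration for n in segment)
    mirrored = [replace(n, onset=end - (n.onset + n.duration)) for n in segment]
    return sorted(mirrored, key=lambda n: n.onset)
```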
Pierre Schaeffer's typomorphology (1966) proposes seven criteria of musical perception for the identification and qualification of sound objects, which form the basis of his musical theory. This Solfège fits well into contexts where pitch is not the dominant dimension. Relying on similarities between the practice of reduced listening and the utilization of low-level audio descriptors, we present the first version of a real-time setup in which these descriptors are applied to qualify percussive sounds. The paper describes the tools and strategies used for addressing different criteria: envelope followers with different window sizes and filtering; detection of transients and amplitude modulations; extraction and counting of spectral components; estimation of intrinsic dissonance and spectral distribution; among others. The extracted data is subjected to simple statistical analysis, producing scalar values associated with each segmented object. Finally, we present a variety of examples.
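Two of the descriptor families listed above (envelope following with a chosen window size, and extraction/counting of spectral components together with a brightness-style measure of spectral distribution) can be sketched offline as follows. This is a minimal Python illustration with assumed window sizes and thresholds, not the real-time implementation described in the paper.

```python
import numpy as np

def envelope(x, fs, win_ms=20.0):
    """RMS envelope follower with a rectangular window of win_ms milliseconds."""
    win = max(1, int(fs * win_ms / 1000.0))
    kernel = np.ones(win) / win
    return np.sqrt(np.convolve(np.asarray(x, dtype=float) ** 2, kernel, mode="same"))

def spectral_centroid(x, fs):
    """Brightness-like descriptor: amplitude-weighted mean frequency of the spectrum."""
    spectrum = np.abs(np.fft.rfft(x * np.hanning(len(x))))
    freqs = np.fft.rfftfreq(len(x), 1.0 / fs)
    return float(np.sum(freqs * spectrum) / (np.sum(spectrum) + 1e-12))

def count_spectral_components(x, rel_threshold=0.05):
    """Rough count of spectral components: local maxima above a relative threshold."""
    spectrum = np.abs(np.fft.rfft(x * np.hanning(len(x))))
    thr = rel_threshold * np.max(spectrum)
    peaks = ((spectrum[1:-1] > spectrum[:-2]) &
             (spectrum[1:-1] > spectrum[2:]) &
             (spectrum[1:-1] > thr))
    return int(np.sum(peaks))
```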