Abstract. We present an HMM-based system for real-time gesture analysis. The system continuously outputs parameters describing the time progression of the gesture and its likelihood. These parameters are computed by comparing the performed gesture with stored reference gestures. The method relies on a detailed modeling of multidimensional temporal curves. Compared with standard HMM systems, the learning procedure is simplified by using prior knowledge, allowing the system to work from a single example per class. Several applications have been developed using this system in the contexts of music education, music and dance performance, and interactive installations. Typically, the estimation of the time progression allows physical gestures to be synchronized with sound files by time stretching/compressing audio buffers or videos.
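A minimal sketch of this kind of follower is given below, under simplifying assumptions of our own rather than the paper's actual implementation: one HMM state per frame of the single reference recording, a fixed Gaussian observation variance standing in for the prior knowledge that replaces a full training stage, and a simple left-to-right transition model. The class name, variance, and transition probabilities are illustrative.

```python
import numpy as np

class GestureFollower:
    """Sketch of a left-to-right HMM gesture follower built from one example.

    Each frame of the reference gesture becomes one state; observation
    likelihoods are Gaussians centered on the reference frames with a fixed
    variance chosen a priori. Names and parameter values are assumptions,
    not the published system.
    """

    def __init__(self, reference, sigma=0.2, trans=(0.5, 0.4, 0.1)):
        self.ref = np.asarray(reference, dtype=float)   # (N, D) template
        self.n = len(self.ref)
        self.sigma = sigma
        self.trans = trans            # P(stay), P(advance by 1), P(advance by 2)
        self.alpha = np.zeros(self.n)
        self.alpha[0] = 1.0           # start at the beginning of the gesture

    def _emission(self, obs):
        # Gaussian likelihood of the incoming frame under each state.
        d2 = np.sum((self.ref - obs) ** 2, axis=1)
        return np.exp(-0.5 * d2 / self.sigma ** 2)

    def step(self, obs):
        """Process one incoming frame; return (progress in [0, 1], likelihood)."""
        p_stay, p_one, p_two = self.trans
        # Left-to-right prediction: stay, move one, or move two states ahead.
        pred = p_stay * self.alpha
        pred[1:] += p_one * self.alpha[:-1]
        pred[2:] += p_two * self.alpha[:-2]
        self.alpha = pred * self._emission(np.asarray(obs, dtype=float))
        likelihood = self.alpha.sum()
        if likelihood > 0:
            self.alpha /= likelihood          # keep a normalized posterior
        progress = float(np.argmax(self.alpha)) / max(self.n - 1, 1)
        return progress, likelihood
```

In this reading, the progress value returned at each frame could drive the playback position of an audio buffer, which is one way to obtain the time stretching/compressing behaviour mentioned in the abstract.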
We present an ensemble of tangible objects and software modules designed for musical interaction and performance. The tangible interfaces form an ensemble of connected objects communicating wirelessly. A central concept is to let users determine the final musical function of the objects, favoring customization, assembly, and repurposing. This might imply combining the wireless interfaces with existing everyday objects or musical instruments. Moreover, gesture analysis and recognition modules allow users to define their own actions/motions for the control of sound parameters. Various sound engines and interaction scenarios were built and experimented with. Some examples developed in a music pedagogy context are described.
This paper presents Gesture Interaction DEsigner (GIDE), an innovative application for gesture recognition. Instead of recognizing gestures only after they have been entirely completed, as happens in classic gesture recognition systems, GIDE exploits the full potential of gestural interaction by tracking gestures continuously and synchronously, allowing users both to control the target application moment-to-moment and to receive immediate, synchronous feedback about the system's recognition states. By this means, they quickly learn how to interact with the system and develop better performances. Furthermore, rather than learning the pre-defined gestures of others, GIDE allows users to design their own gestures, making interaction more natural and allowing applications to be tailored to users' specific needs. We describe our system, which demonstrates these new qualities that combine to provide fluid gesture interaction design, through evaluations with a range of performers and artists.
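The per-frame, user-defined behaviour described here can be illustrated by running one follower per user-recorded gesture and reporting, at every incoming frame, which gesture is most likely and how far it has progressed. The sketch below is only an illustration of that interaction principle, reusing the hypothetical GestureFollower class from the earlier block; it is not GIDE's actual implementation, and all names are assumptions.

```python
import numpy as np

def make_recognizer(user_examples):
    """user_examples: dict mapping gesture name -> (N, D) recorded template."""
    # One follower per user-recorded gesture (GestureFollower is the
    # hypothetical class sketched earlier, not part of GIDE).
    followers = {name: GestureFollower(ref) for name, ref in user_examples.items()}

    def on_frame(obs):
        # Update every follower with the incoming frame, then report the most
        # likely gesture together with its time progression, so the host
        # application can react moment-to-moment and show synchronous feedback.
        scores = {name: f.step(obs) for name, f in followers.items()}
        best = max(scores, key=lambda name: scores[name][1])
        progress, likelihood = scores[best]
        return best, progress, likelihood

    return on_frame

# Example: two gestures recorded by the user as short 2-D trajectories.
circle = np.column_stack([np.cos(np.linspace(0, 2 * np.pi, 50)),
                          np.sin(np.linspace(0, 2 * np.pi, 50))])
swipe = np.column_stack([np.linspace(-1, 1, 50), np.zeros(50)])
recognize = make_recognizer({"circle": circle, "swipe": swipe})
label, progress, likelihood = recognize(circle[10])   # feed one live frame
```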