Proceedings of the 7th International Conference on New Interfaces for Musical Expression (NIME '07), 2007
DOI: 10.1145/1279740.1279819
Gesture control of sounds in 3D space

Abstract: This paper presents a methodology and a set of tools for gesture control of sources in 3D surround sound. The techniques for rendering acoustic events on multi-speaker or headphone-based surround systems have evolved considerably, making it possible to use them in real-time performances on light equipment. Controlling the placement of sound sources is usually done in idiosyncratic ways and has not yet been fully explored and formalized. This issue is addressed here with the proposition of a methodical approach…
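The abstract describes gesture control of source placement for surround rendering. As a minimal sketch of the kind of parameter mapping such a system needs (not the paper's actual implementation; function names and the coordinate convention are assumptions), the following converts a listener-relative 3D gesture position into the azimuth, elevation, distance and gain values a surround or binaural renderer typically expects:

```python
# Minimal sketch: map a tracked 3D gesture position (listener-relative, metres)
# to spherical source coordinates plus a simple distance-based gain.
# Names and the axis convention are illustrative, not taken from the paper.
import math

def cartesian_to_spherical(x, y, z):
    """Return (azimuth_deg, elevation_deg, distance_m) for a position."""
    distance = math.sqrt(x * x + y * y + z * z)
    azimuth = math.degrees(math.atan2(y, x))  # convention: 0 deg along +x
    elevation = math.degrees(math.asin(z / distance)) if distance > 0 else 0.0
    return azimuth, elevation, distance

def distance_gain(distance, ref=1.0, min_dist=0.1):
    """Inverse-distance attenuation relative to a reference distance."""
    return ref / max(distance, min_dist)

# Example: a gesture tracked 1 m to the left of and slightly above the listener.
az, el, dist = cartesian_to_spherical(0.0, 1.0, 0.3)
print(f"azimuth={az:.1f} deg, elevation={el:.1f} deg, "
      f"distance={dist:.2f} m, gain={distance_gain(dist):.2f}")
```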

Cited by 15 publications (11 citation statements); references 4 publications.
“…Therefore, it seems reasonable to regard the means for spatial sound control used in live musical performance from the perspective of the design practice of digital musical instruments (DMIs). This potential link has already been roughly explored in previous research (Wanderley and Orio 2002; Marshall et al. 2007; Schacher 2007; Perez-Lopez 2015), with a particular focus on the gestural control paradigm. At the core of the DMI metaphor, as introduced by Miranda and Wanderley (2006), stands the decoupling of the physical interface (input or control device) from the sound-generating system (in contrast to the integral concept of acoustic musical instruments).…”
Section: Sound Spatialisation Controllers in Context of Digital Music
confidence: 86%
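The decoupling described in this citation can be illustrated with a thin mapping layer between controller data and a spatial renderer. The sketch below is an assumption-laden example (the OSC addresses, port and class names are hypothetical, not taken from the cited works), using the python-osc library to show how the control device and the sound-generating system only meet through messages:

```python
# Sketch of DMI-style decoupling: the physical interface and the spatial
# renderer never touch each other directly; a replaceable mapping layer
# translates gesture data into OSC messages. Addresses and the renderer
# on port 9000 are assumptions for illustration only.
from pythonosc.udp_client import SimpleUDPClient

class SpatialMapping:
    """Maps controller readings to renderer parameters, independent of
    which physical interface produced the readings."""

    def __init__(self, host="127.0.0.1", port=9000):
        self.client = SimpleUDPClient(host, port)

    def send_source_position(self, source_id, azimuth_deg, elevation_deg, distance_m):
        # One message per parameter keeps the mapping swappable without
        # changing either the controller or the rendering engine.
        self.client.send_message(f"/source/{source_id}/azim", float(azimuth_deg))
        self.client.send_message(f"/source/{source_id}/elev", float(elevation_deg))
        self.client.send_message(f"/source/{source_id}/dist", float(distance_m))

if __name__ == "__main__":
    mapping = SpatialMapping()
    # Pretend the input device reported a gesture at the front-left.
    mapping.send_source_position(source_id=1, azimuth_deg=45.0,
                                 elevation_deg=10.0, distance_m=2.0)
```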
“…There have been a few recent attempts to review the evolution of spatialisation controllers from a historical and musicological perspective (e.g., Brech 2015; Brech and Paland 2015). Some authors have explicitly focused on spatialisation interfaces for real-time performances of music (Mooney 2005; Johnson et al. 2014a), while others have discussed more recent developments of sound spatialisation systems and spatial rendering frameworks (Marshall et al. 2007; Perez-Lopez 2015; Peters 2011; Peters et al. 2009; Schacher 2007) as the core component of common software solutions for sound spatialisation. By providing a classification system and a first systematic inventory of spatialisation controllers, our contribution aims at providing deeper insight into the design and performance practice of spatial sound controllers.…”
Section: A Systematic Inventory of Spatial Sound Controllers for Real…
confidence: 99%
“…In the second, performance movements are used to trigger spatialization, as in the case of indirect control of spatialization through dancers' movements [Wijnans 2010]. Schacher [2007] distinguishes the latter two approaches as top-down and bottom-up control of spatialization.…”
Section: Interaction Design Aspects
confidence: 99%
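A schematic way to contrast the two strategies, under the assumption that top-down means the performer specifies the source position directly while bottom-up means low-level movement features only trigger or modulate an autonomous spatialisation process (function names and thresholds are illustrative, not taken from Schacher [2007]):

```python
# Schematic contrast of top-down vs. bottom-up control of spatialisation.
# Assumption: "top-down" = gesture position maps directly onto the source
# position; "bottom-up" = a movement feature (here speed) only gates an
# autonomous trajectory generator.
import math

def top_down_position(gesture_xy):
    """Direct control: the tracked gesture position *is* the source position."""
    x, y = gesture_xy
    return math.degrees(math.atan2(y, x)), math.hypot(x, y)  # azimuth, distance

def bottom_up_position(speed, t, threshold=0.5, rate_hz=0.2):
    """Indirect control: fast movement switches on a circular trajectory;
    the performer influences *whether* the source moves, not *where*."""
    if speed < threshold:
        return 0.0, 1.0                      # source rests at the front
    azimuth = (360.0 * rate_hz * t) % 360.0  # the generator takes over the path
    return azimuth, 1.0

print(top_down_position((1.0, 1.0)))          # ~45 deg azimuth, ~1.41 m
print(bottom_up_position(speed=0.8, t=2.5))   # trajectory active at t = 2.5 s
```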
“…As in our previous works (Kontogeorgakopoulos, Kotsifa and Erichsen 2011), key motion features such as velocity, presence, position, orientation and acceleration were detected and tracked; in this work, we limited the use of sensors to a single camera-based motion-tracking system. Camera-based motion tracking is not uncommon in interactive music, art and design (Levin 2006; Schacher 2010; Wechsler, Weiss and Dowling 2004; Winkler 1997). Among several existing technological solutions (Max/MSP Jitter with the cv.jit library, the EyesWeb platform, the Processing programming language with the OpenCV library, and the openFrameworks framework with ofxOpenCV), the Eyecon system was utilised.…”
Section: Technology
confidence: 99%
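The cited work used the Eyecon system; purely as an illustrative stand-in (not the authors' pipeline), the sketch below shows how frame differencing with OpenCV can yield the kind of motion features listed above: a motion centroid as position, and successive differences as velocity and acceleration estimates.

```python
# Illustrative stand-in for camera-based motion-feature extraction:
# frame differencing -> binary motion mask -> centroid (position),
# then finite differences for velocity and acceleration (pixels/frame).
import cv2
import numpy as np

cap = cv2.VideoCapture(0)                     # default camera
prev_gray, prev_pos, prev_vel = None, None, np.zeros(2)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    if prev_gray is not None:
        diff = cv2.absdiff(gray, prev_gray)   # pixels that changed between frames
        _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
        m = cv2.moments(mask)
        if m["m00"] > 0:                      # some motion is present
            pos = np.array([m["m10"] / m["m00"], m["m01"] / m["m00"]])
            if prev_pos is not None:
                vel = pos - prev_pos          # velocity estimate
                acc = vel - prev_vel          # acceleration estimate
                print(f"pos={pos.round(1)} vel={vel.round(1)} acc={acc.round(1)}")
                prev_vel = vel
            prev_pos = pos
    prev_gray = gray
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```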