2010
DOI: 10.1017/s1355771810000300

Spatial Sound Synthesis in Computer-Aided Composition

Abstract: In this article we describe our ongoing research and development efforts towards integrating the control of sound spatialisation in computer-aided composition. Most commonly, the process of sound spatialisation is separated from the world of symbolic computation. We propose a model in which spatial sound rendering is regarded as a subset of sound synthesis, and spatial parameters are treated as abstract musical materials within a global compositional framework. The library OMPrisma is presented, which implemen…

Cited by 9 publications (8 citation statements)
References 22 publications

“…In these tasks, the standard mouse-based interaction techniques separate the 3D positional control, but there has been interest in advancing this for more integral interaction techniques. A popular method in 3D graphics is to use the mouse and ray-casting techniques (Van Emmerik, 1990; Schumacher and Bresson, 2010; Bresson and Schumacher, 2011), but this is not widely used in consumer based 3D audio production tools. Available 3D audio software tools, such as 3DC Lib in OpenMusic (Bresson and Schumacher, 2011), Spat library for Max (Carpentier, 2015), and Spatial Audio Workstation 1, provide multiple 2D planes that represent the same 3D space, allowing users to compose and edit 3D audio trajectories.…”
Section: Related Work
Confidence: 99%
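
The "multiple 2D planes" approach mentioned in this quote can be pictured as editing a 3D source position through its orthographic projections. The sketch below is purely illustrative plain Python, assuming nothing about the actual APIs of 3DC Lib, Spat or the Spatial Audio Workstation; it only shows how a 3D point decomposes into top, front and side views, and how an edit made in one view folds back into the shared coordinates.

```python
# Illustrative only: decompose a 3D source position into the three 2D views
# (top, front, side) used by plane-based trajectory editors, and apply an
# edit made in one view back to the 3D point. Hypothetical helper functions,
# not the API of any of the tools cited above.

def views(pos):
    x, y, z = pos
    return {"top": (x, y), "front": (x, z), "side": (y, z)}

def edit_in_view(pos, view, new_2d):
    x, y, z = pos
    a, b = new_2d
    if view == "top":      # editing (x, y); z is untouched
        return (a, b, z)
    if view == "front":    # editing (x, z); y is untouched
        return (a, y, b)
    if view == "side":     # editing (y, z); x is untouched
        return (x, a, b)
    raise ValueError(f"unknown view: {view}")

pos = (1.0, -2.0, 0.5)
print(views(pos))                            # three coupled 2D projections
print(edit_in_view(pos, "top", (0.0, 3.0)))  # -> (0.0, 3.0, 0.5)
```
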
“…In the OpenMusic computer-aided composition environment, a number of tools and libraries exist that allow the integration of the algorithmic generation of trajectories to compositional processes and offline spatial audio rendering [Bresson, 2012]. In particular, the OMPrisma library provides a compositional toolbox featuring algorithmically driven manipulations of spatial parameters [Schumacher and Bresson, 2010]. These tools lack real-time feedback and monitoring of dynamic aspects, since the composers have to convert the graphical scores into control data or sounds before hearing and assessing any audio results.…”
Section: Spatial Authoring
Confidence: 99%
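
As a rough illustration of the algorithmically driven spatial control described in that passage, the sketch below generates an ascending spiral trajectory as a list of time-stamped 3D breakpoints. This is a minimal Python sketch of the general idea only; the function name and parameters are hypothetical and do not reflect OMPrisma's actual OpenMusic/Common Lisp interface.

```python
import math

def spiral_trajectory(duration, revolutions=2.0, radius=3.0,
                      start_height=0.0, end_height=2.0, n_points=64):
    """Hypothetical helper: sample an ascending spiral as (time, x, y, z) breakpoints.

    Mimics, in spirit only, the algorithmic generation of spatial trajectories
    discussed above; it is not OMPrisma's API.
    """
    points = []
    for i in range(n_points):
        u = i / (n_points - 1)                   # normalised position 0..1
        angle = 2.0 * math.pi * revolutions * u  # azimuth in radians
        x = radius * math.cos(angle)
        y = radius * math.sin(angle)
        z = start_height + u * (end_height - start_height)
        points.append((u * duration, x, y, z))
    return points

# Example: an 8-second spiral that an offline renderer (or an export step)
# could consume as a breakpoint function for a moving source.
for t, x, y, z in spiral_trajectory(8.0)[:4]:
    print(f"t={t:5.2f}s  x={x:+.2f}  y={y:+.2f}  z={z:+.2f}")
```

In an offline, score-based workflow such breakpoint lists would be computed, inspected and transformed symbolically before any audio is rendered, which is precisely the "no real-time feedback" trade-off the quoted passage points out.
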
“…One involves extending real-time environments with time-oriented structures [Schnell et al., 2009, Agostini and Ghisi, 2013]. The other one consists of integrating audio processing engines within offline computer-aided composition frameworks [Laurson et al., 2005, Schumacher and Bresson, 2010]. In this paper we aim at blending these two solutions so that composers can explore and assess their ideas through real-time interaction while specifying the temporal aspects of digital sound processing.…”
Confidence: 99%
“…In particular, the integration of the CHANT synthesizer (Rodet, Potard, and Barrière 1984) in this environment shall emphasize interesting issues about how this continuous conception can be tackled in OMChroma. Current research has also focused on sound spatialization and the introduction of spatial rendering in the OMChroma framework (Schumacher and Bresson 2010).…”
Confidence: 99%