Guide to Computing for Expressive Music Performance (2012)
DOI: 10.1007/978-1-4471-4123-5_2

Systems for Interactive Control of Computer Generated Music Performance

Cited by 6 publications (8 citation statements). References 46 publications.

“…2) Synthesized Music: Real-time synthesis approaches make it easy to exert control over a larger set of musical parameters and to craft more complex DMIs [49]. Sonification parameters used in these designs include musical pitch [50]–[52], tempo [36], brightness [51], mix balance [51], chord arpeggio characteristics [53], musical layer richness [54], synthetic tone additions [54], and percussive sample triggering [55].…”
Section: Musical Biofeedback (mentioning)
confidence: 99%
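
To make the parameter list above concrete, here is a minimal sketch of a biofeedback-style sonification mapping: one normalized biosignal drives three of the surveyed parameters (pitch, tempo, brightness). The signal ranges, scaling curves, and names are illustrative assumptions, not taken from any of the cited designs.

```python
def sonify(heart_rate_bpm, rest_bpm=60.0, max_bpm=180.0):
    """Map one biosignal sample to synthesis parameters.

    Hypothetical mapping for illustration: pitch, tempo, and brightness
    are three of the parameter types listed in the excerpt above.
    """
    # Normalize into 0..1 against an assumed physiological range.
    x = (heart_rate_bpm - rest_bpm) / (max_bpm - rest_bpm)
    x = max(0.0, min(1.0, x))
    return {
        "pitch_midi": 48 + round(x * 24),  # C3..C5, quantized to semitones
        "tempo_bpm": 80.0 + 60.0 * x,      # tempo rises with the signal
        "brightness": x * x,               # e.g. a normalized filter cutoff
    }

print(sonify(120))  # {'pitch_midi': 60, 'tempo_bpm': 110.0, 'brightness': 0.25}
```
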
“…Table 3 shows several feedback and conductor models. For a more thorough review of feedback models we refer the reader to Fabiani et al. (2013). Common ways for a user to control certain aspects of a performance are either via high-level semantic descriptors that describe the intended expressive character, often selected from some 2D space related to Russell's (1980) valence-arousal plane (Friberg, 2006; Canazza et al., 2015); or via physical gestures, measured either through motion capture (Fabiani, 2011) or by using physical interfaces (Chew et al., 2005; Dixon et al., 2005; Baba et al., 2010).…”
Section: Conductor Systems (mentioning)
confidence: 99%
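
The valence-arousal style of control described above can be sketched in a few lines: a 2D point chosen by the user is mapped to coarse performance parameters. The particular mapping rules below are assumptions made for illustration, not those of any cited conductor system.

```python
def expressive_params(valence, arousal):
    """Map a point in a Russell-style valence-arousal plane (each axis in
    -1..1) to coarse performance parameters. Illustrative assumptions only."""
    def clamp(v):
        return max(-1.0, min(1.0, v))

    valence, arousal = clamp(valence), clamp(arousal)
    return {
        "tempo_scale": 1.0 + 0.4 * arousal,  # higher arousal: faster...
        "loudness_db": 6.0 * arousal,        # ...and louder
        "legato": 0.5 + 0.4 * valence,       # positive valence: more legato
    }

# "Happy" corner of the plane maps to a fast, loud, legato rendering.
print(expressive_params(valence=0.8, arousal=0.9))
```
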
“…Further in this text we elaborate on musical gesture, redefining it as a concept beyond bodily movement. Before that, we build on the latter three parts of Wanderley's proposal, which have often been presented as an electronic musical instrument's main constituting modules (Wessel & Wright, 2002; Hunt, Kirk & Neighbour, 2004; Armstrong, 2006; Magnusson, 2010; Fabiani, Friberg & Bresin, 2013; de Campo, 2014).…”
Section: Generalized Model of Musical Machines (mentioning)
confidence: 99%
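
The module decomposition referred to above is often summarized as sensing (the gestural controller), mapping, and sound generation. The sketch below wires those three stages into a toy pipeline; all names and stage boundaries are hypothetical illustrations, not an API from the cited works.

```python
class DigitalInstrument:
    """Toy pipeline mirroring a commonly cited decomposition of an
    electronic instrument: sensing -> mapping -> sound generation."""

    def __init__(self, sense, map_gesture, generate):
        self.sense = sense              # sensor input -> gesture data
        self.map_gesture = map_gesture  # gesture data -> synthesis parameters
        self.generate = generate        # synthesis parameters -> audio (stub)

    def tick(self):
        """One control-rate step through the whole chain."""
        return self.generate(self.map_gesture(self.sense()))

# Stub stages wired together; a real instrument would swap in actual modules.
inst = DigitalInstrument(
    sense=lambda: {"pressure": 0.7},
    map_gesture=lambda g: {"amplitude": g["pressure"]},
    generate=lambda p: f"audio frame at amplitude {p['amplitude']}",
)
print(inst.tick())
```
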
“…As interaction within a gestural agency system demands that action be taken to reach a musical goal, agents are necessarily presented with challenges. It has been argued that, to be interesting, a musical instrument needs to pose a balanced challenge: neither so great as to be frustrating nor so small as to be unappealing (Wanderley & Orio, 2002; Wessel & Wright, 2002; Levitin, McAdams & Adams, 2002; McDermott et al., 2013; Fabiani, Friberg & Bresin, 2013). This notion conforms to the concept of flow, defined as an optimal state of wellbeing achieved by an activity whose challenges or opportunities for action match the individual's skill level (Nakamura & Csikszentmihalyi, 2014).…”
Section: Gestural Agency (mentioning)
confidence: 99%