Proceedings of the 12th International Audio Mostly Conference on Augmented and Participatory Sound and Music Experiences 2017
DOI: 10.1145/3123514.3123556

Deep Models for Ensemble Touch-Screen Improvisation

Abstract: For many, the pursuit and enjoyment of musical performance goes hand-in-hand with collaborative creativity, whether in a choir, jazz combo, orchestra, or rock band. However, few musical interfaces use the affordances of computers to create or enhance ensemble musical experiences. One possibility for such a system would be to use an artificial neural network (ANN) to model the way other musicians respond to a single performer. Some forms of music have well-understood rules for interaction; however, this is not …
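A minimal sketch of that core idea, a recurrent network that reads the lead player's recent touch-gesture classes and predicts a response gesture for each simulated ensemble member, is given below. This is an illustration only, not the paper's implementation; the name GestureRNN, the gesture-class count, the layer sizes, and the three per-member output heads are assumptions.

import torch
import torch.nn as nn

N_GESTURES = 9   # assumed number of touch-gesture classes
N_ENSEMBLE = 3   # ensemble members simulated by the network

class GestureRNN(nn.Module):
    def __init__(self, hidden=64):
        super().__init__()
        self.embed = nn.Embedding(N_GESTURES, 16)
        self.rnn = nn.LSTM(16, hidden, batch_first=True)
        # one classification head per simulated ensemble member
        self.heads = nn.ModuleList(
            [nn.Linear(hidden, N_GESTURES) for _ in range(N_ENSEMBLE)]
        )

    def forward(self, lead_gestures):
        # lead_gestures: (batch, time) integer gesture classes from the lead player
        h, _ = self.rnn(self.embed(lead_gestures))
        last = h[:, -1, :]  # hidden state after the latest gesture
        return [head(last) for head in self.heads]  # logits per ensemble member

# Usage: predict the ensemble's next gestures from the lead's last 8 gestures.
model = GestureRNN()
lead = torch.randint(0, N_GESTURES, (1, 8))
responses = [logits.argmax(dim=-1).item() for logits in model(lead)]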

Cited by 1 publication (1 citation statement) | References 8 publications | Citing publication year: 2018
“…The Neural Touchscreen Ensemble [51], a system developed by the authors, is an RNN-driven simulation of a touchscreen ensemble experience. A human performer plays freely improvised music on a touchscreen and an ensemble performance is continually played back on three RNN-controlled touchscreen devices in response.…”
Section: The Neural Touchscreen Ensemble
confidence: 99%
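The interaction loop this citation statement describes can be pictured with the following sketch, which reuses the hypothetical GestureRNN above and assumes the ensemble devices are controlled over OSC; the device addresses, port, the /gesture message, and the two-second update interval are assumptions, not details from the paper.

import time
import torch
from pythonosc import udp_client

DEVICE_IPS = ["192.168.0.11", "192.168.0.12", "192.168.0.13"]  # three RNN-driven devices (assumed addresses)
clients = [udp_client.SimpleUDPClient(ip, 9000) for ip in DEVICE_IPS]

model = GestureRNN()
model.eval()
history = []  # lead performer's recent gesture classes

def on_lead_gesture(gesture_class):
    # Called whenever the lead performer's latest touches are classified.
    history.append(gesture_class)
    lead = torch.tensor(history[-8:]).unsqueeze(0)  # last 8 gestures
    with torch.no_grad():
        logits_per_member = model(lead)
    for client, logits in zip(clients, logits_per_member):
        client.send_message("/gesture", int(logits.argmax()))

# Toy driver: feed one classified gesture every two seconds.
for g in [0, 3, 3, 5, 1]:
    on_lead_gesture(g)
    time.sleep(2)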