2011 International Conference on Collaboration Technologies and Systems (CTS)
DOI: 10.1109/cts.2011.5928689
Enabling humanoid musical interaction and performance


Cited by 8 publications (6 citation statements)
References 11 publications
“…Neither of these systems, however, considers mood. The RoboCup Junior competition includes a 'dancing' event, in which robots dance to a piece of music. The music is known in advance and played from file, though, so musical features can be marked by hand instead of being extracted from the music.…”
Section: Related Work
confidence: 99%
“…Hubo is an adult-sized humanoid developed by the Korea Advanced Institute of Science and Technology (KAIST). We have used Hubo in several other tasks involving robotic reactions to audio, such as moving in synchrony with audio beats [2]. Hubo possesses over forty degrees of freedom and can perform smooth and graceful motions such as tai chi, so it is a suitable choice for a robot that will need to move in human-like ways and display emotion.…”
Section: Introduction
confidence: 99%
“…Ye et al. developed a co-play algorithm to play a vibraphone [10], and Takuma et al. developed a two-level synchronization algorithm for co-play [11]. Moreover, Youngmoo et al. developed a humanoid for musical interaction [12]. Takeshi et al. investigated human-robot ensembles with a humanoid thereminist [13] and developed an algorithm for gesture recognition and beat tracking [14].…”
Section: Introduction
confidence: 99%
“…We have performed substantial prior work on enabling robots to step or dance in response to music (Grunberg, Lofaro, Oh, & Kim, 2011; Kim et al., 2010). These performances, however, only considered the beat locations and tempo of the audio, and disregarded other aspects.…”
Section: Introduction
confidence: 99%
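
The citation statements above repeatedly refer to extracting beat locations and tempo from audio so that a robot can move in synchrony with music. As a minimal illustrative sketch of that feature-extraction step (not the cited authors' implementation), assuming the open-source librosa library and a hypothetical input file "performance.wav":

    # Sketch of beat and tempo extraction, assuming librosa is available.
    # The cited systems do not necessarily use this library; the file name
    # "performance.wav" is a hypothetical placeholder.
    import librosa

    # Load the recording as a mono waveform at librosa's default sample rate.
    y, sr = librosa.load("performance.wav")

    # Estimate the global tempo (BPM) and the frame indices of detected beats.
    tempo, beat_frames = librosa.beat.beat_track(y=y, sr=sr)

    # Convert beat frames to timestamps in seconds; a motion controller could
    # use these to schedule steps or dance gestures on the beat.
    beat_times = librosa.frames_to_time(beat_frames, sr=sr)

    print("Estimated tempo (BPM):", tempo)
    print("First beat times (s):", beat_times[:8])

This only recovers beat times and tempo, which is exactly the limitation noted in the last excerpt: other musical aspects, such as mood, are not captured by this step.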