With a view to studying the development of social relationships between humans and robots, it is our contention that music can help provide extended engagement and open-ended interaction. In this paper we explore the effectiveness of music as a mode of engagement using Mortimer, a robot that plays a drum kit and employs a composition algorithm to respond to a human pianist. We used this system to conduct a comparative study of the effects of presenting the robot as a social actor versus as an instrument. Using automated behavioural metrics, including face tracking, we found that participants in the social actor condition played for longer uninterrupted and stopped the robot mid-performance less often. They also spent more time looking at the robot when not playing and less time looking at the piano when playing. We suggest these results indicate greater fluency of playing, greater engagement, and stronger feelings of social presence towards the robot when it is presented as a social actor.
The 23rd IEEE International Symposium on Robot and Human Interactive Communication
It is our hypothesis that improvised musical interaction can provide the extended engagement that has often eluded long-term Human-Robot Interaction (HRI) trials. Our previous work found that simply framing sessions with our drumming robot Mortimer as social interactions increased both social presence and engagement, two factors we consider crucial to developing and maintaining a positive and meaningful relationship between human and robot. In this study we investigate the addition of further social modalities, namely head pose and facial expression, since nonverbal behaviour has been shown to be an important conveyor of information in both social and musical contexts. Following a six-week experimental study using automated behavioural metrics, the results show that participants exposed to the nonverbal behaviours not only spent more time voluntarily with the robot, but increased the time they spent as the trial progressed. They also interrupted the robot less during social interactions and played for longer uninterrupted. Conversely, they looked at the robot less in both musical and social contexts. We take these results as support for open-ended musical activity providing a solid grounding for human-robot relationships, further improved by the inclusion of appropriate nonverbal behaviours.
Social relationships between humans and robots require both long-term engagement and a feeling of believability, or social presence, towards the robot. It is our contention that music can provide the extended engagement that other open-ended interaction studies have failed to achieve, and that, in combination with engaging musical interaction, the addition of simulated social behaviours is necessary to trigger this sense of believability or social presence. Building on previous studies with our robot drummer Mortimer showing that including social behaviours can increase engagement and social presence, we present the results of a longitudinal study investigating the effect of extending weekly collocated musical improvisation sessions by making Mortimer an active member of each participant's virtual social network. Although we found the effects of extending the relationship into the virtual world to be less pronounced than those we previously observed when adding social modalities to human-robot musical interaction, interesting questions are raised about the interpretation of our automated behavioural metrics across different contexts. Further, we found repeated results of increasingly uninterrupted playing, and notable differences between responses to online posts by Mortimer and posts by participants' human friends.
In this article, we present research on customizing a variational autoencoder (VAE) neural network to learn models of musical rhythms encoded within a latent space and to play with them. The system uses a data structure capable of encoding rhythms in simple and compound meter and can learn models from little training data. To facilitate exploration of the models, we implemented a visualizer that relies on the dynamic nature of the pulsing rhythmic patterns. To test our system in real-life musical practice, we collected small-scale datasets of rhythms from contemporary music genres and trained models on them. We found that the non-linearities of the learned latent spaces, coupled with tactile interfaces for interacting with the models, were very expressive and led to unexpected places in both composition and live performance. An album was recorded and premiered at a major music festival using the VAE latent space on stage.
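The idea in the abstract above can be sketched in miniature: rhythms represented as binary onset grids are compressed by a VAE encoder into a low-dimensional latent space, and decoding points in that space (including interpolations between patterns) yields new rhythms. The sketch below is an illustrative assumption, not the authors' implementation: the step counts, latent dimensionality, and toy untrained weights are all hypothetical, and a real system would learn the weights from rhythm data.

```python
import numpy as np

rng = np.random.default_rng(0)

STEPS = 16   # one bar of simple meter at 16th-note resolution (12 for compound)
LATENT = 2   # a 2-D latent space is easy to visualize and to map to controls

# Toy, untrained linear weights; a trained model would learn these.
W_enc = rng.normal(scale=0.1, size=(STEPS, 2 * LATENT))  # -> [mean | logvar]
W_dec = rng.normal(scale=0.1, size=(LATENT, STEPS))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def encode(pattern):
    """Map a binary onset grid to a latent mean and log-variance."""
    h = pattern @ W_enc
    return h[:LATENT], h[LATENT:]

def reparameterize(mean, logvar):
    """Sample z = mean + sigma * eps (the VAE reparameterization trick)."""
    eps = rng.normal(size=mean.shape)
    return mean + np.exp(0.5 * logvar) * eps

def decode(z):
    """Map a latent point back to per-step onset probabilities in [0, 1]."""
    return sigmoid(z @ W_dec)

# Round-trip a four-on-the-floor pattern through the (untrained) model.
kick = np.zeros(STEPS)
kick[::4] = 1.0
mean, logvar = encode(kick)
probs = decode(reparameterize(mean, logvar))

# Interpolating between latent points is how a performer "plays" the space:
# intermediate z values decode to rhythms between the two endpoint patterns.
z_a, z_b = np.ones(LATENT), -np.ones(LATENT)
morph = decode(0.5 * (z_a + z_b))
```

In performance, each latent dimension can be bound to a tactile controller (a knob or fader), so moving through the space morphs the decoded rhythm continuously, which is one plausible reading of the "tactile interfaces" the abstract mentions.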