Humans interact with other humans at a variety of timescales and in a variety of social contexts. We exhibit patterns of coordination that may differ depending on whether we are genuinely interacting as part of a coordinated group of individuals versus merely co-existing within the same physical space. Moreover, the local coordination dynamics of an interacting pair of individuals in an otherwise non-interacting group may spread, propagating change in the global coordination dynamics and interaction of an entire crowd. Dynamical systems analyses, such as Recurrence Quantification Analysis (RQA), can shed light on some of the underlying coordination dynamics of multi-agent human interaction. We used RQA to examine the coordination dynamics of a performance of “Welcome to the Imagination World”, composed for wind orchestra. This performance enacts a real-life simulation of the transition from uncoordinated, non-interacting individuals to a coordinated, interacting multi-agent group. Unlike previous studies of social interaction in musical performance, which rely on video and/or acoustic data recorded from each individual, this project analyzes group-level coordination patterns solely from the group-level acoustic data of an audio recording of the performance. Recurrence and stability measures extracted from the audio recording increased when the musicians coordinated as an interacting group. Variability in these measures also increased, indicating that the interacting ensemble of musicians was able to explore a greater variety of behavior than when the musicians performed as non-interacting individuals. Because this performance is an orchestrated (non-emergent) example of coordination, we believe these analyses indicate the approximate distributions of recurrence measures to expect before and after truly emergent coordination.
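For readers unfamiliar with RQA, the following is a minimal, self-contained Python sketch of the two quantities most relevant to this abstract: recurrence rate and determinism, a common stability measure. The embedding dimension, delay, radius, and sine-wave input are illustrative placeholders, not parameters or data from the study; a published pipeline would typically use a dedicated library (e.g., pyunicorn) and estimate the embedding parameters from the signal.

import numpy as np

def embed(signal, dim=3, delay=10):
    """Time-delay embedding of a 1-D signal into dim-dimensional state space."""
    n = len(signal) - (dim - 1) * delay
    return np.column_stack([signal[i * delay : i * delay + n] for i in range(dim)])

def recurrence_matrix(points, radius):
    """Binary matrix: 1 wherever two embedded states lie within `radius`."""
    dists = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    return (dists < radius).astype(int)

def recurrence_rate(R):
    """Density of recurrent point pairs, excluding trivial self-matches."""
    n = R.shape[0]
    return (R.sum() - n) / (n * (n - 1))

def determinism(R, l_min=2):
    """Share of recurrent points on diagonal lines of length >= l_min,
    a standard proxy for the stability/predictability of the dynamics."""
    n = R.shape[0]
    on_lines = total = 0
    for k in range(-(n - 1), n):
        if k == 0:
            continue                     # skip the trivial main diagonal
        run = 0
        for v in list(np.diagonal(R, offset=k)) + [0]:  # sentinel closes last run
            if v:
                run += 1
            else:
                total += run
                if run >= l_min:
                    on_lines += run
                run = 0
    return on_lines / total if total else 0.0

# Toy usage: a sine wave (highly regular dynamics) yields high determinism.
t = np.linspace(0, 8 * np.pi, 500)
R = recurrence_matrix(embed(np.sin(t)), radius=0.3)
print(f"RR = {recurrence_rate(R):.3f}, DET = {determinism(R):.3f}")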
In this article, we review recent advances in research on rhythm and musical beat perception, focusing on the role of predictive processes in auditory-motor interactions. We suggest that experimental evidence of the motor system's role in beat perception, including during passive listening, may be explained by the generation and maintenance of internal predictive models, in accord with the Active Inference framework of sensory processing. We highlight two complementary hypotheses about the neural underpinnings of rhythm perception, the Action Simulation for Auditory Prediction (ASAP) hypothesis (Patel and Iversen, 2014) and the Gradual Audiomotor Evolution (GAE) hypothesis (Merchant and Honing, 2014), and review recent experimental progress supporting each. While the initial formulations of ASAP and GAE explain different aspects of beat-based timing (the involvement of motor structures in the absence of movement, and physical entrainment to an auditory beat, respectively), we suggest that work under both hypotheses provides converging evidence for the predictive role of the motor system in rhythm perception and for the specific neural mechanisms involved. We close by discussing the future experimental work needed to evaluate the causal neural mechanisms underlying beat and rhythm perception.
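To make the notion of an internal predictive model concrete, the sketch below implements a generic two-parameter phase and period correction loop of the kind long used to model sensorimotor synchronization. It is our illustrative toy, not a model proposed under ASAP or GAE; the gain parameters alpha and beta and the stimulus values are arbitrary assumptions.

import numpy as np

def entrain(onsets, period0, alpha=0.7, beta=0.3):
    """Each observed onset yields a prediction error that corrects the
    next predicted beat time (gain alpha) and the internal period (gain beta)."""
    period = period0
    pred = onsets[0] + period              # predict the second beat
    preds, errors = [], []
    for onset in onsets[1:]:
        e = onset - pred                   # asynchrony: positive = onset was late
        preds.append(pred)
        errors.append(e)
        period += beta * e                 # tempo (period) adaptation
        pred = pred + period + alpha * e   # phase-corrected next prediction
    return np.array(preds), np.array(errors)

# Toy usage: onsets at a 600 ms inter-onset interval; the model starts with a
# wrong internal tempo (500 ms) and locks on within a few beats.
onsets = np.arange(12) * 0.6
preds, errors = entrain(onsets, period0=0.5)
print(np.round(errors, 3))   # prediction errors shrink toward zero

The point of the sketch is that "prediction" here is literal: the model commits to a beat time before the onset arrives, and the error signal drives both phase and tempo updates, which is the minimal computational role both hypotheses assign to motor circuitry.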
It is now widely accepted that much of animal communication is conducted via several modalities, e.g., acoustic and visual, either simultaneously or sequentially. This is a laudable multimodal turn relative to traditional accounts of the temporal aspects of animal communication, which have focused on a single modality at a time. However, the fields currently contributing to the study of multimodal communication are highly varied and still largely disconnected, given their sole focus on a particular level of description or their exclusive concern with either human or non-human animals. Here, we provide an integrative overview of converging findings showing how multimodal processes occurring at the neural, bodily, and social interactional levels each contribute uniquely to the complex rhythms that characterize communication in human and non-human animals. Although we address the findings for each of these levels independently, we conclude that the most important challenge in this field is to identify how processes at these different levels connect. This article is part of the theme issue ‘Synchrony and rhythm interaction: from the brain to behavioural ecology’.
It is now widely known that much animal communication is conducted over several modalities, e.g., acoustic and visual, either simultaneously or sequentially. Despite this awareness, students of synchrony and rhythm interaction in animal communication have traditionally analyzed a single modality at a time. This paper reviews our current knowledge of non-human and human rhythm interaction in multimodal communication at several levels of analysis. We begin by examining how multimodality can be defined as a scale-free phenomenon. We then review how communicative rhythms are sustained by the unique characteristics of processes at several levels, considering rhythms shaped by neural-cognitive, peripheral bodily, and social interactive constraints. From these findings, the potential adaptiveness of multimodal interaction, both within and between individuals, can be inferred.