Body synchronization between interacting people involves coordinative movements in time, space and form. New technologies for automated video analysis and motion tracking have considerably improved the accuracy with which coordination can be measured, particularly in temporal and spatial terms. However, the form of interpersonal coordination has been less explored. In the present study, we address this gap by examining the effect of trust on temporal and morphological patterns of interpersonal coordination. We adapted an optical motion-capture system to record spontaneous body movements in pairs of individuals engaged in natural conversations. We conducted two experiments in which we manipulated trust through a breach of expectancy (Study 1: 10 trustful and 10 distrustful participants) and through friendship (Study 2: 20 dyads of friends and 20 dyads of strangers). In Study 1, results show strong, early mirror-like coordination by participants in response to the confederates' breach of trust. In Study 2, imitative coordination tended to be more pronounced in pairs of friends than in pairs of non-friends. Overall, our results show not only that listeners move in reaction to speakers, but also that speakers react to listeners with a chain of dynamic coordination patterns affected by the immediate disposition of, and long-term relationship with, their interlocutors.
This study explored the effects of musical improvisation between dyads of same-sex strangers on subsequent behavioural alignment. Participants (all non-musicians) conversed before and after either improvising music together (the Musical Improvisation, MI, group) or performing a motoric, non-rhythmic cooperative task of building a tower together using wooden blocks (the Hands-Busy, HB, group). Conversations were free, but initially guided by an adaptation of the Fast Friends Questionnaire for inducing talk among students who are strangers meeting for the first time. Throughout, participants' motion was recorded with an optical motion-capture system (Mocap) and analysed in terms of speed cross-correlations. Their conversations were also recorded on separate channels using headset microphones and were analysed in terms of the periodicity displayed by rhythmic peaks in the turn transitions across question-and-answer pairs (Q+A pairs). Compared with their first conversations, the MI group's second conversations showed: (a) very rapid, partially simultaneous anatomical coordination between 0 and 0.4 s; (b) delayed mirror motoric coordination between 0.8 and 1.5 s; and (c) a higher proportion of periodic Q+A pairs. In contrast, the HB group's motoric coordination changed slightly in timing but not in degree of coordination between the first and second conversations, and there was no significant change in the proportion of periodic Q+A pairs they produced. These results show a convergent effect of prior musical interaction on joint body movement and on the use of shared periodicity across speech turn transitions in conversation, suggesting that interaction in music and speech may be mediated by common processes.
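The speed cross-correlation analysis mentioned above can be illustrated with a minimal sketch. The function below is a hypothetical reconstruction, not the authors' actual pipeline: it derives frame-to-frame speed from two 3-D marker trajectories, z-scores each speed series, and evaluates the normalized correlation at a range of lags, so that a peak at a positive lag indicates that the second mover's speed profile follows the first's.

```python
import numpy as np

def speed_cross_correlation(pos_a, pos_b, dt, max_lag):
    """Cross-correlate the speed profiles of two movers.

    pos_a, pos_b : (T, 3) arrays of 3-D marker positions (hypothetical data).
    dt           : sampling interval in seconds.
    max_lag      : maximum lag, in samples, to evaluate on each side.
    Returns (lags in seconds, correlation at each lag).
    """
    # Speed = magnitude of the frame-to-frame displacement divided by dt.
    speed_a = np.linalg.norm(np.diff(pos_a, axis=0), axis=1) / dt
    speed_b = np.linalg.norm(np.diff(pos_b, axis=0), axis=1) / dt
    # z-score so correlation values are comparable across dyads.
    a = (speed_a - speed_a.mean()) / speed_a.std()
    b = (speed_b - speed_b.mean()) / speed_b.std()
    n = len(a)
    lags = np.arange(-max_lag, max_lag + 1)
    corr = np.empty(len(lags))
    for i, lag in enumerate(lags):
        if lag < 0:
            # b leads a: pair a[t] with b[t + lag] (lag negative).
            corr[i] = np.mean(a[-lag:] * b[:n + lag])
        else:
            # a leads b: pair a[t] with b[t + lag].
            corr[i] = np.mean(a[:n - lag] * b[lag:])
    return lags * dt, corr
```

With this sign convention, a correlation peak at +0.1 s would mean the second participant's movement speed tracks the first participant's with a 100 ms delay, the kind of asymmetry the abstract describes between mirror-like and simultaneous coordination windows.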
Drawing on the notion of musical intervals, recent studies have demonstrated the presence of frequency ratios within human vocalisation. Methodologically, these studies have addressed human vocalisation at the single-individual level. In the present study, we asked whether patterns such as musical intervals can also be detected among the voices of people engaged in conversation, as an emergent interpersonal phenomenon. A total of 56 participants were randomly paired and assigned to either a control or a low-trust condition. Frequency ratios were generated by juxtaposing nonlocal fundamental frequency (F0) emissions from the two people engaged in each dyadic conversation. Differences were found between conditions in both interval distribution and interval accuracy. These results support the idea that psychological dispositions modulate the musical intervals generated between participants through mutual real-time vocal accommodation, and they underscore the socio-intentional dimension of music in vocal pitch interplay.
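The mapping from a pair of F0 values to a musical interval can be sketched as follows. This is an illustrative assumption about the analysis, not the study's published procedure: it converts the frequency ratio to cents, folds it into one octave, and labels it with the nearest equal-tempered interval together with the deviation in cents (a natural measure of interval "accuracy").

```python
import numpy as np

# 12-tone equal-temperament interval names within one octave
# (an assumed labeling scheme for illustration).
INTERVALS = ["unison", "minor 2nd", "major 2nd", "minor 3rd",
             "major 3rd", "perfect 4th", "tritone", "perfect 5th",
             "minor 6th", "major 6th", "minor 7th", "major 7th"]

def classify_interval(f0_a, f0_b):
    """Label the ratio between two F0 values (Hz) as the nearest
    equal-tempered interval, octave-folded, with deviation in cents."""
    cents = 1200.0 * np.log2(f0_b / f0_a)   # signed distance in cents
    folded = cents % 1200.0                  # fold into a single octave
    nearest = round(folded / 100.0)          # nearest semitone step
    deviation = folded - nearest * 100.0     # error from the ideal interval
    return INTERVALS[nearest % 12], deviation
```

For example, F0 values of 220 Hz and 330 Hz form a 3:2 ratio, which this function labels a perfect 5th with a deviation of about +2 cents (the just fifth is slightly wider than its equal-tempered counterpart).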