Frontiers in Digital Humanities, 2017 · DOI: 10.3389/fdigh.2017.00009

Extracting Coarse Body Movements from Video in Music Performance: A Comparison of Automated Computer Vision Techniques with Motion Capture Data

Abstract: The measurement and tracking of body movement within musical performances can provide valuable sources of data for studying interpersonal interaction and coordination between musicians. The continued development of tools to extract such data from video recordings will offer new opportunities to research musical movement across a diverse range of settings, including field research and other ecological contexts in which the implementation of complex motion capture (MoCap) systems is not feasible or affordable. …

Cited by 23 publications (31 citation statements) · References 24 publications

Citation statements (ordered by relevance):
“…However, these systems have mainly been utilized with a few markers for tracking the coordination of specific body parts, such as hand positions of dyads (Gueugnon et al., 2016; Gorman et al., 2017), the position of people’s heads during joint tasks (Kijima et al., 2017), coordination of finger motion between dyads (Oullier et al., 2008; Fine et al., 2015), coordination of foot movements between dyads (Vesper et al., 2013) and coordination of heads and hands between dyads (Dammeyer and Køppe, 2013). To a lesser extent, optical motion capture systems have been utilized for tracking upper-body movements by placing more than one or two infrared sensors on people performing joint tasks (Varlet et al., 2011; Llobera et al., 2016; Preissmann et al., 2016; Stevanovic et al., 2017), playing music together (Ragert et al., 2013; Glowinski et al., 2015; Jakubowski et al., 2017) or moving together to the rhythm of music (Burger et al., 2014; Toiviainen et al., 2014). Analysis of interpersonal coordination has mainly been performed with the same non-linear methods mentioned above for data obtained through the other motion-tracking systems.…”
Section: Motion Tracking Methods (mentioning)
confidence: 99%
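The non-linear coordination analyses this statement refers to are commonly recurrence-based (e.g. cross-recurrence quantification). The sketch below is an illustrative assumption, not the cited authors' pipeline: it computes a cross-recurrence rate between two performers' movement time series using NumPy, with synthetic placeholder data and arbitrary embedding dimension, lag and radius.

```python
# Minimal cross-recurrence sketch (an assumption, not any cited study's exact
# pipeline): quantify coordination between two performers' movement series.
import numpy as np

def embed(x, dim=3, lag=2):
    """Time-delay embedding of a 1-D series into dim-dimensional state vectors."""
    n = len(x) - (dim - 1) * lag
    return np.column_stack([x[i * lag : i * lag + n] for i in range(dim)])

def cross_recurrence_rate(x, y, dim=3, lag=2, radius=0.1):
    """Fraction of embedded state pairs (x_i, y_j) closer than `radius`."""
    X, Y = embed(x, dim, lag), embed(y, dim, lag)
    # Pairwise Euclidean distances between the two embedded trajectories.
    d = np.linalg.norm(X[:, None, :] - Y[None, :, :], axis=-1)
    return (d < radius).mean()

# Placeholder usage: x and y stand in for z-scored movement speeds of a dyad.
rng = np.random.default_rng(0)
x = rng.standard_normal(500).cumsum()
y = x + rng.standard_normal(500)  # a loosely coupled partner series
x = (x - x.mean()) / x.std()
y = (y - y.mean()) / y.std()
print(f"cross-recurrence rate: {cross_recurrence_rate(x, y):.3f}")
```

Higher recurrence rates indicate that the two trajectories revisit similar states more often; in practice, parameters such as the embedding lag and radius must be chosen for the data at hand.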
“…Recent publications have provided overviews of empirical findings (Rennung and Göritz, 2016; Vicaria and Dickens, 2016) as well as methods for capturing and analyzing patterns of coordinated movement (Rein, 2016; Chetouani et al., 2017; Jakubowski et al., 2017). For example, Vicaria and Dickens (2016) presented an exhaustive meta-analysis of interpersonal coordination outcomes, and Rennung and Göritz (2016) did the same for the prosocial consequences of interpersonal synchrony.…”
Section: Introduction (mentioning)
confidence: 99%
“…OF is a standard computer vision technique that performs two-dimensional movement tracking on video data by estimating the apparent velocities of objects. The EyesWeb implementation of OF used in this study is based on the algorithm of Farnebäck [63] and has been validated for movement tracking in music performance using a diverse range of video-recorded materials (with different camera angles, instruments, performer positions and clothing) in Jakubowski et al. [51]; for an application of OF to studying movement coordination in conversation, see also [14]. For each video, two regions of interest (ROIs) were manually selected that corresponded to the upper-body region of each performer.…”
Section: Experiment 1: Predicting Visual Bouts Of Interaction From Mo… (mentioning)
confidence: 99%
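EyesWeb is a graphical patch-based environment, so the cited pipeline has no single code listing. A rough equivalent can be sketched with OpenCV's dense Farnebäck implementation (`cv2.calcOpticalFlowFarneback`), reducing each frame to one movement-magnitude value per performer within a manually chosen ROI. The file name, ROI coordinates and flow parameters below are placeholders, not values from the cited study.

```python
# Sketch (not the authors' EyesWeb patch): per-frame movement magnitude of one
# performer via dense Farneback optical flow restricted to an upper-body ROI.
import cv2
import numpy as np

video_path = "duo_performance.mp4"      # placeholder file name
x, y, w, h = 100, 50, 220, 300          # placeholder upper-body ROI (pixels)

cap = cv2.VideoCapture(video_path)
ok, frame = cap.read()
if not ok:
    raise IOError("could not read first video frame")
prev = cv2.cvtColor(frame[y:y+h, x:x+w], cv2.COLOR_BGR2GRAY)

movement = []                           # one scalar per frame transition
while True:
    ok, frame = cap.read()
    if not ok:
        break
    curr = cv2.cvtColor(frame[y:y+h, x:x+w], cv2.COLOR_BGR2GRAY)
    # Dense optical flow (Farneback): per-pixel (dx, dy) between frames.
    flow = cv2.calcOpticalFlowFarneback(prev, curr, None,
                                        pyr_scale=0.5, levels=3, winsize=15,
                                        iterations=3, poly_n=5, poly_sigma=1.2,
                                        flags=0)
    # Mean flow speed over the ROI approximates coarse body-movement quantity.
    movement.append(np.linalg.norm(flow, axis=-1).mean())
    prev = curr
cap.release()

movement = np.asarray(movement)  # time series for coordination analyses
```

Running this once per performer ROI yields two movement time series that can then be fed into coordination measures such as the cross-recurrence sketch above.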
“…Video recordings are a non-invasive and inexpensive alternative that can be collected in a wide variety of real-world settings, from music festivals and gigs in nightclubs to cross-cultural field research. The work presented here makes use of automated computer vision techniques, which have been validated for use in tracking ancillary movements of musical performers from video [51]. Specifically, this method allows for the quantification of gross body movements, such as body sway and head nods, which have been implicated as key sources of co-performer interaction in previous work [34, 39].…”
Section: Introduction (mentioning)
confidence: 99%
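One way the body sway and head nods mentioned here could be separated is by the sign and axis of the flow vectors: lateral sway dominates the horizontal component, nodding the vertical one. The helper below is a hedged sketch building on the optical-flow pipeline above; `flow_stack` is a hypothetical array of saved flow fields, and head nods would normally be measured on a head-only ROI rather than the full upper body.

```python
# Sketch (an assumption about separating movement components, not a method
# from the cited paper): signed mean flow along each axis per frame.
import numpy as np

def sway_and_nod(flow_stack):
    """flow_stack: (frames, H, W, 2) array of Farneback flow fields for one ROI.
    Returns per-frame mean horizontal (sway) and vertical (nod) velocities."""
    sway = flow_stack[..., 0].mean(axis=(1, 2))  # signed x-velocity
    nod = flow_stack[..., 1].mean(axis=(1, 2))   # signed y-velocity
    return sway, nod
```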