We address synchronization to rhythms of varying musical complexity. In two experiments, synchronization to simple and to more complex rhythmic sequences was investigated. Experiment 1 examined responses to phase and tempo perturbations within simple, structurally isochronous sequences presented at different base rates. Experiment 2 investigated responses to similar perturbations embedded within more complex, metrically structured sequences; participants were explicitly instructed to synchronize at different metrical levels (i.e., to tap at different rates to the same rhythmic patterns) on different trials. We found evidence that (1) the intrinsic tapping frequency adapts in response to temporal perturbations in both simple (isochronous) and complex (metrically structured) rhythms, (2) people can synchronize with unpredictable, metrically structured rhythms at different metrical levels, with qualitatively different patterns of synchronization at higher versus lower levels of metrical structure, and (3) synchronization at each tapping level reflects information from other metrical levels. The latter finding provides evidence for a dynamic and flexible internal representation of the sequence's metrical structure.
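Finding (1), the adaptation of the intrinsic tapping frequency, is often formalized as a two-process error-correction model of sensorimotor synchronization. The sketch below is a minimal generic version of that idea, not the authors' own (oscillator-based) model: each tap-stimulus asynchrony drives both a phase correction of the next tap and a period correction of the internal interval, so the tapper re-entrains after a tempo perturbation.

```python
# Minimal two-process synchronization model (a generic sketch, not the
# authors' oscillator model): each tap-stimulus asynchrony e corrects
# both the next tap's phase (gain alpha) and the internal period
# (gain beta), so the tapper adapts to a tempo change.

def simulate(stimulus_onsets, period0, alpha=0.5, beta=0.2):
    """Return the asynchronies (tap time - stimulus time) at each beat."""
    period = period0            # internal (intrinsic) inter-tap interval
    tap = stimulus_onsets[0]    # assume the first tap lands on the beat
    asynchronies = []
    for onset in stimulus_onsets:
        e = tap - onset                   # asynchrony at this beat
        asynchronies.append(e)
        tap = tap + period - alpha * e    # phase correction
        period = period - beta * e        # period correction
    return asynchronies

# Stimulus: 500 ms beats that step to 600 ms halfway through (a tempo
# perturbation); the model taps early at the step, then re-converges.
onsets, t = [], 0.0
for i in range(40):
    onsets.append(t)
    t += 500.0 if i < 20 else 600.0

asyn = simulate(onsets, period0=500.0)
```

With these (illustrative) gains, the linearized error dynamics are stable, so the large asynchrony introduced at the tempo step decays over the following beats as the internal period relaxes toward the new 600 ms interval.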
How does a baseball outfielder know where to run to catch a fly ball? The “outfielder problem” remains unresolved, and its solution would provide a window into the visual control of action. It may seem obvious that human action is based on an internal model of the physical world, such that the fielder predicts the landing point from a mental model of the ball’s trajectory (trajectory prediction, TP). But two alternative theories, Optical Acceleration Cancellation (OAC) and Linear Optical Trajectory (LOT), propose that fielders are led to the right place at the right time by coupling their movements to visual information in a continuous “online” manner. All three theories predict successful catches and similar running paths. We provide a critical test by using virtual reality to perturb the vertical motion of the ball in mid-flight. The results confirm the predictions of OAC, but are at odds with LOT and TP.
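The geometry behind OAC can be illustrated numerically. Under the standard drag-free projectile assumption (a simplification; real fly balls experience drag), the tangent of the ball's elevation angle rises linearly for an observer standing exactly at the landing point, so its optical acceleration is zero; an observer standing elsewhere sees nonzero optical acceleration, which is the error signal the fielder is proposed to null by moving:

```python
import numpy as np

# Optical Acceleration Cancellation (OAC), illustrated with a drag-free
# projectile: tan(elevation angle) seen by an observer at the landing
# point grows linearly in time (zero optical acceleration), while an
# observer beyond the landing point sees curved optical motion.

g, vx, vy = 9.81, 10.0, 15.0          # gravity, launch velocities (m/s)
T = 2 * vy / g                         # flight time
D = vx * T                             # landing distance from launch

t = np.linspace(0.05, T - 0.05, 200)   # avoid the exact endpoints
y = vy * t - 0.5 * g * t**2            # ball height over time

def optical_acceleration(observer_x):
    """Second time-derivative of tan(elevation) for a ground observer."""
    tan_alpha = y / np.abs(observer_x - vx * t)
    d1 = np.gradient(tan_alpha, t)
    return np.gradient(d1, t)

acc_at_landing = optical_acceleration(D)        # ~0 throughout flight
acc_off_point = optical_acceleration(1.3 * D)   # clearly nonzero
```

Algebraically, for an observer at the landing point, tan α = (g / 2vₓ)·t, which is why the second derivative vanishes there and only there along the line of flight.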
In studies of rhythmic coordination, where sensory information is often generated by an auditory stimulus, spatial and temporal variability are known to decrease at points in the movement cycle coincident with the stimulus, a phenomenon known as anchoring (Byblow et al. 1994). Here we hypothesize that the role of anchoring may be to globally stabilize coordination under conditions in which it would otherwise undergo a global coordinative change such as a phase transition. To test this hypothesis, anchoring was studied in a bimanual coordination paradigm in which either inphase or antiphase coordination was produced as auditory pacing stimuli (and hence movement frequency) were scaled over a wide range of frequencies. Two different anchoring conditions were used: a single-metronome condition, in which peak amplitude of right finger flexion coincided with the auditory stimulus; and a double-metronome condition, in which each finger reversal (flexion and extension) occurred simultaneously with the auditory stimuli. Anchored reversal points displayed lower spatial variation than unanchored reversal points, resulting in more symmetric phase plane trajectories in the double- than the single-metronome condition. The global coordination dynamics of the double-metronome condition was also more stable, with transitions from antiphase to inphase occurring less often and at higher movement frequencies than in the single-metronome condition. An extension of the Haken-Kelso-Bunz model of bimanual coordination is presented briefly which includes specific coupling of sensory information to movement through a process we call parametric stabilization. The parametric stabilization model provides a theoretical account of both local effects on the individual movement trajectories (anchoring) and global stabilization of observed coordination patterns, including the delay of phase transitions.
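The global coordination dynamics referred to here can be sketched with the standard Haken-Kelso-Bunz relative-phase equation (this is the unextended HKB form; the parametric-stabilization coupling is the authors' contribution and is not modeled here): dφ/dt = −a sin φ − 2b sin 2φ. As movement frequency rises, the ratio b/a falls; antiphase (φ = π) loses stability below b/a = 0.25 while inphase (φ = 0) persists, producing the antiphase-to-inphase transitions described above.

```python
import math

# Standard Haken-Kelso-Bunz (HKB) relative-phase equation (without the
# parametric-stabilization extension described in the abstract):
#   dphi/dt = -a*sin(phi) - 2*b*sin(2*phi)
# Antiphase (phi = pi) is stable only while b/a > 0.25; lowering b/a
# (i.e., raising movement frequency) forces a transition to inphase.

def settle(phi0, a, b, dt=0.01, steps=20000):
    """Integrate the HKB equation from phi0 and return the final phase."""
    phi = phi0
    for _ in range(steps):
        phi += dt * (-a * math.sin(phi) - 2.0 * b * math.sin(2.0 * phi))
    return phi

slow = settle(math.pi - 0.1, a=1.0, b=1.0)   # b/a = 1.0: stays antiphase
fast = settle(math.pi - 0.1, a=1.0, b=0.1)   # b/a = 0.1: falls to inphase
```

The stability boundary follows from linearizing at φ = π: the eigenvalue there is a − 4b, which changes sign at b/a = 0.25.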
Immersive virtual environments are a promising research tool for the study of perception and action, on the assumption that visual-motor behavior in virtual and real environments is essentially similar. We investigated this issue for locomotor behavior and tested the generality of Fajen and Warren's [2003] steering dynamics model. Participants walked to a stationary goal while avoiding a stationary obstacle in matched physical and virtual environments. There were small but reliable differences in locomotor paths: maximum deviation was 0.16 m larger, obstacle clearance 0.16 m larger, and walking speed 0.13 m/s slower in the virtual environment. Separate model fits closely captured the mean virtual and physical paths (R² > 0.98). Simulations implied that the path differences are not due to walking speed or a 50% distance compression in virtual environments, but might result from greater uncertainty about the egocentric location of virtual obstacles. On the other hand, paths had similar shapes in the two environments, with no difference in median curvature, and could be modeled with a single set of parameter values (R² > 0.95). Fajen and Warren's original parameters successfully generalized to new virtual and physical object configurations (R² > 0.95). These results justify the use of virtual environments to study locomotor behavior.
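The Fajen-Warren steering dynamics model treats heading as a damped angular variable attracted to the goal direction and repelled from obstacle directions, with both influences decaying with distance. The sketch below follows the published form of the equation; the parameter values and scene layout are illustrative assumptions, not the fitted values reported in the study.

```python
import math

# Sketch of the Fajen-Warren (2003) steering-dynamics model. The form
# of the equation follows the paper; parameter values and the scene
# layout here are illustrative, not the study's fitted values.
#   heading accel = -b*dphi
#                   - k_g*(phi - psi_g)*(exp(-c1*d_g) + c2)
#                   + k_o*(phi - psi_o)*exp(-c3*|phi - psi_o|)*exp(-c4*d_o)

def steer_to_goal(goal, obstacle, speed=1.0, dt=0.01,
                  b=3.25, k_g=7.5, c1=0.4, c2=0.4,
                  k_o=198.0, c3=6.6, c4=0.8):
    x, y, phi, dphi = 0.0, 0.0, 0.0, 0.0   # start at origin, heading +x
    path = [(x, y)]
    for _ in range(3000):
        d_g = math.hypot(goal[0] - x, goal[1] - y)
        if d_g < 0.1:                       # goal reached
            break
        psi_g = math.atan2(goal[1] - y, goal[0] - x)
        d_o = math.hypot(obstacle[0] - x, obstacle[1] - y)
        psi_o = math.atan2(obstacle[1] - y, obstacle[0] - x)
        ddphi = (-b * dphi
                 - k_g * (phi - psi_g) * (math.exp(-c1 * d_g) + c2)
                 + k_o * (phi - psi_o) * math.exp(-c3 * abs(phi - psi_o))
                       * math.exp(-c4 * d_o))
        dphi += dt * ddphi
        phi += dt * dphi
        x += dt * speed * math.cos(phi)
        y += dt * speed * math.sin(phi)
        path.append((x, y))
    return path

# Goal straight ahead at (8, 0); obstacle slightly off the direct line,
# so the simulated walker detours around it and then closes on the goal.
path = steer_to_goal(goal=(8.0, 0.0), obstacle=(4.0, 0.2))
```

Because obstacle repulsion decays exponentially with distance and angular offset while goal attraction grows near the goal, the same equation produces both the detour and the final approach without any path planning.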
We investigated people’s ability to adapt to the fluctuating tempi of music performance. In Experiment 1, four pieces from different musical styles were chosen, and performances were recorded from a skilled pianist who was instructed to play with natural expression. Spectral and rescaled range analyses of the interbeat interval time series revealed long-range (1/f-type) serial correlations and fractal scaling in each piece. Stimuli for Experiment 2 included two of the performances from Experiment 1, with mechanical versions serving as controls. Participants tapped the beat at the quarter-note and eighth-note metrical levels, successfully adapting to large tempo fluctuations in both performances. Participants predicted the structured tempo fluctuations, with superior performance at the quarter-note level. Thus, listeners may exploit long-range correlations and fractal scaling to predict tempo changes in music.
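Rescaled range (R/S) analysis, one of the methods named above, can be sketched as follows (a generic textbook implementation, not the authors' analysis pipeline): for each window size, the range of the mean-adjusted cumulative sum is divided by the window's standard deviation, and the slope of log(R/S) against log(window size) estimates the Hurst exponent, with H ≈ 0.5 for uncorrelated noise and H > 0.5 for the persistent, 1/f-type fluctuations found in expressive timing.

```python
import numpy as np

# Rescaled range (R/S) analysis for detecting long-range correlations:
# for each window size n, compute the range of the mean-adjusted
# cumulative sum divided by the standard deviation, then estimate the
# Hurst exponent H as the log-log slope across window sizes.

def hurst_rs(x, sizes=(16, 32, 64, 128, 256)):
    x = np.asarray(x, dtype=float)
    log_n, log_rs = [], []
    for n in sizes:
        rs_vals = []
        for start in range(0, len(x) - n + 1, n):
            seg = x[start:start + n]
            dev = np.cumsum(seg - seg.mean())   # cumulative deviations
            r = dev.max() - dev.min()           # range within the window
            s = seg.std()
            if s > 0:
                rs_vals.append(r / s)
        log_n.append(np.log(n))
        log_rs.append(np.log(np.mean(rs_vals)))
    slope, _ = np.polyfit(log_n, log_rs, 1)
    return slope

rng = np.random.default_rng(0)
white = rng.standard_normal(4096)              # uncorrelated series
walk = np.cumsum(rng.standard_normal(4096))    # strongly persistent series

h_white = hurst_rs(white)   # near 0.5 (small-sample bias pushes it up)
h_walk = hurst_rs(walk)     # well above 0.5
```

A series of expressive interbeat intervals analyzed this way yields an exponent between these extremes when, as reported here, the tempo fluctuations carry long-range serial correlations.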