Previous studies have linked brain oscillations and timing, with evidence suggesting that alpha oscillations (~10 Hz) may serve as a “sample rate” for the visual system. However, a direct effect of manipulating alpha oscillations on time perception has not yet been demonstrated. To test this, we had eighteen human subjects perform a time generalization task with visual stimuli. Additionally, we had previously recorded resting-state EEG from each subject and calculated their Individual Alpha Frequency (IAF), estimated as the peak frequency of the mean spectrum over posterior electrodes between 8 and 13 Hz. Participants first learned a standard interval (600 ms) and were then required to judge whether new temporal intervals were equal to or different from that standard. After learning the standard, participants performed this task while receiving occipital transcranial Alternating Current Stimulation (tACS). Crucially, for each subject, tACS was administered at their IAF or at off-peak alpha frequencies (IAF ± 2 Hz). Results demonstrated a linear shift in the psychometric function, indicating a modification of perceived duration, such that progressively “faster” alpha stimulation led to longer perceived intervals. These results provide the first evidence that direct manipulation of alpha oscillations can shift perceived time in a manner consistent with a clock-speed effect.
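As an illustration of the IAF estimation step described above, the following minimal sketch computes the 8–13 Hz peak of the average posterior spectrum and derives the three stimulation frequencies (IAF, IAF ± 2 Hz). This is not the authors' pipeline; the electrode indices, sampling rate, and spectral-estimation parameters are assumptions for demonstration only.

```python
# Illustrative sketch (not the study's actual pipeline): estimate the Individual
# Alpha Frequency (IAF) as the 8-13 Hz spectral peak of the mean posterior
# spectrum, then derive the tACS frequencies (IAF and IAF +/- 2 Hz).
# Electrode indices, sampling rate, and Welch parameters are assumptions.
import numpy as np
from scipy.signal import welch

def estimate_iaf(eeg, fs, posterior_idx, fmin=8.0, fmax=13.0):
    """eeg: channels x samples resting-state recording; fs: sampling rate in Hz."""
    freqs, psd = welch(eeg[posterior_idx], fs=fs, nperseg=int(4 * fs))  # 4-s windows
    mean_psd = psd.mean(axis=0)                    # average spectrum over posterior channels
    band = (freqs >= fmin) & (freqs <= fmax)       # restrict to the alpha band
    return freqs[band][np.argmax(mean_psd[band])]  # frequency of the alpha peak

# Example with synthetic data: 32 channels, 2 minutes at 500 Hz (assumed values)
fs = 500
eeg = np.random.randn(32, fs * 120)
posterior_idx = [24, 25, 26, 27, 28]               # hypothetical parieto-occipital channels
iaf = estimate_iaf(eeg, fs, posterior_idx)
tacs_freqs = {"slow": iaf - 2.0, "peak": iaf, "fast": iaf + 2.0}
print(iaf, tacs_freqs)
```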
In order to navigate through the environment, humans must be able to measure both the distance traveled in space and the interval covered in time. Yet how these two dimensions are computed and interact across neural systems remains unknown. One possibility is that subjects measure how far and how long they have traveled relative to a known reference point, or anchor. To test this, we had human participants (n = 24) perform a distance estimation task in a virtual environment in which they were cued to attend to either the spatial or the temporal interval traveled, while brain responses were measured with multiband fMRI. We observed that both dimensions evoked similar frontoparietal networks, yet with a striking rostrocaudal dissociation between temporal and spatial estimation. Multivariate classifiers trained on each dimension were further able to predict the temporal or spatial interval traveled, with centers of activation in the supplementary motor area (SMA) and retrosplenial cortex (RSC) for time and space, respectively. Further, a cross-classification approach revealed the right supramarginal gyrus (SMG) and occipital place area (OPA) as regions capable of decoding the general magnitude of traveled distance. Altogether, our findings suggest the brain uses separate systems for tracking spatial and temporal distance, which are combined together with amodal magnitude estimates.
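To make the cross-classification logic concrete, the sketch below trains a decoder on multivoxel patterns from one dimension (time) and tests it on the other (space); above-chance transfer in a region would indicate a shared, dimension-general magnitude code. This is only a schematic under assumed data shapes, labels, and classifier choice, not the authors' actual analysis.

```python
# Schematic of cross-classification decoding: train on temporal-attention trials,
# test on spatial-attention trials. Data here are synthetic; shapes, binary labels
# (short vs. long), and the linear SVM are assumptions for illustration.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n_trials, n_voxels = 120, 200
X_time = rng.standard_normal((n_trials, n_voxels))   # voxel patterns, time-attention trials
y_time = rng.integers(0, 2, n_trials)                # short vs. long interval
X_space = rng.standard_normal((n_trials, n_voxels))  # voxel patterns, space-attention trials
y_space = rng.integers(0, 2, n_trials)               # short vs. long distance

clf = make_pipeline(StandardScaler(), LinearSVC())
clf.fit(X_time, y_time)                              # train on the temporal dimension
transfer_acc = clf.score(X_space, y_space)           # test on the spatial dimension
print(f"cross-dimension decoding accuracy: {transfer_acc:.2f}")
```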
There have been significant advances in biosignal extraction techniques to drive external biomechatronic devices or to serve as inputs to sophisticated human-machine interfaces. The control signals are typically derived from biological signals such as myoelectric measurements, made either on the surface of the skin or subcutaneously, and other biosignal sensing modalities are emerging. With improvements in sensing modalities and control algorithms, it is becoming possible to robustly control the target position of an end-effector. It remains largely unknown, however, to what extent these improvements can lead to naturalistic, human-like movement. In this paper, we sought to answer this question. We utilized a sensing paradigm called sonomyography, based on continuous ultrasound imaging of forearm muscles. Unlike myoelectric control strategies, which measure electrical activation and use the extracted signals to determine the velocity of an end-effector, sonomyography measures muscle deformation directly with ultrasound and uses the extracted signals to proportionally control the position of an end-effector. Previously, we showed that users were able to accurately and precisely perform a virtual target acquisition task using sonomyography. In this work, we investigate the time course of the control trajectories derived from sonomyography. We show that the time course of the sonomyography-derived trajectories that users take to reach virtual targets reflects kinematic characteristics typical of biological limbs. Specifically, during the target acquisition task, velocity profiles followed the minimum-jerk trajectory described for point-to-point arm reaching movements, with similar times to target. In addition, the ultrasound-derived trajectories showed a systematic delay and scaling of peak movement velocity as movement distance increased. We believe this is the first evaluation of the similarity between the control policies underlying coordinated movements of jointed limbs and those based on position control signals extracted at the level of individual muscles. These results have strong implications for the future development of control paradigms for assistive technologies.
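For reference, the minimum-jerk profile mentioned above is the standard benchmark for point-to-point reaches: for a movement of amplitude D and duration T, peak velocity equals 1.875·D/T and occurs at mid-movement, so it scales with movement distance. The sketch below computes this profile; the amplitudes and duration are illustrative values, not the study's data.

```python
# Sketch of the standard minimum-jerk profile for a point-to-point movement of
# amplitude D and duration T (the benchmark the sonomyography trajectories are
# compared against). Peak velocity = 1.875 * D / T at mid-movement, so it scales
# with movement distance. Amplitudes and duration below are illustrative only.
import numpy as np

def minimum_jerk(D, T, n=200):
    """Return time, position, and velocity of a minimum-jerk reach."""
    t = np.linspace(0.0, T, n)
    tau = t / T                                                 # normalized time in [0, 1]
    pos = D * (10 * tau**3 - 15 * tau**4 + 6 * tau**5)          # minimum-jerk position
    vel = (D / T) * (30 * tau**2 - 60 * tau**3 + 30 * tau**4)   # its time derivative
    return t, pos, vel

for D in (0.1, 0.2, 0.4):                                       # larger distance -> larger peak velocity
    _, _, vel = minimum_jerk(D, T=1.0)
    print(f"distance {D:.1f}: peak velocity {vel.max():.3f}")
```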