Gait analysis is widely used in clinical practice to evaluate abnormal gait caused by disease. Conventionally, medical professionals use motion capture systems or visual observation to evaluate a patient's gait. Recent biomedical engineering studies have proposed easy-to-use gait analysis methods employing wearable sensors with inertial measurement units (IMUs). IMUs placed on the shanks just above the ankles allow long-term gait monitoring because the participant can walk with or without shoes during the analysis. To the authors' knowledge, no IMU-based gait analysis method has been reported that simultaneously estimates stride length, gait speed, stride duration, stance duration, and swing duration. In the present study, we tested a proposed gait analysis method that uses IMUs attached to the shanks to estimate foot trajectory and temporal gait parameters. The proposed method comprises two steps: segmentation of continuous gait data into individual steps, and three-dimensional trajectory estimation from the accelerometer and gyroscope data. We evaluated the method by analyzing the gait of 19 able-bodied participants (mean age 23.9 years; 9 men, 10 women). Wearable sensors were attached to the participants' shanks, and three-axis acceleration and three-axis angular velocity were measured to estimate foot trajectory during walking. Gait parameters estimated from the foot trajectory obtained with the proposed method were compared with those measured with a motion capture system. Mean accuracy (± standard deviation) was 0.054 ± 0.031 m for stride length, 0.034 ± 0.039 m/s for gait speed, 0.002 ± 0.020 s for stride duration, 0.000 ± 0.017 s for stance duration, and 0.002 ± 0.024 s for swing duration. These results suggest that the proposed method is suitable for gait analysis, although there is room to improve its accuracy; further development of this IMU-based method should enable its use in clinical gait analysis.
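The abstract does not detail the trajectory-estimation step. A standard approach for shank- or foot-mounted IMUs is strapdown integration with a per-stride zero-velocity update (ZUPT): integrate the gyroscope to track orientation, rotate the acceleration into the world frame, remove gravity, and double-integrate, correcting drift at each stance phase. The Python sketch below illustrates that generic approach under simplifying assumptions; all names and parameters are ours, and this is not the authors' implementation.

```python
import numpy as np

def estimate_stride_trajectory(acc, gyro, dt, g=9.81):
    """Estimate one stride's 3-D foot trajectory from shank IMU data.

    acc  : (N, 3) accelerometer samples in the sensor frame [m/s^2]
    gyro : (N, 3) angular velocity samples in the sensor frame [rad/s]
    dt   : sampling interval [s]

    Assumes the stride has already been segmented so that the first and
    last samples fall in mid-stance, where the shank is near-stationary.
    """
    n = len(acc)
    # Integrate angular velocity to track orientation as rotation matrices.
    R = np.empty((n, 3, 3))
    R[0] = np.eye(3)  # assumption: sensor frame aligned with world at stance
    for k in range(1, n):
        wx, wy, wz = gyro[k] * dt
        # Skew-symmetric matrix of the incremental body-frame rotation.
        Omega = np.array([[0.0, -wz,  wy],
                          [ wz, 0.0, -wx],
                          [-wy,  wx, 0.0]])
        # First-order update; a quaternion or matrix-exponential update
        # is more accurate at high angular rates.
        R[k] = R[k - 1] @ (np.eye(3) + Omega)

    # Rotate acceleration into the world frame and subtract gravity.
    acc_world = np.einsum('nij,nj->ni', R, acc) - np.array([0.0, 0.0, g])

    # Integrate to velocity, then apply a linear ZUPT correction so the
    # velocity returns to zero at the terminal mid-stance sample.
    vel = np.cumsum(acc_world, axis=0) * dt
    vel -= np.linspace(0.0, 1.0, n)[:, None] * vel[-1]

    # Integrate again to position; stride length is the horizontal travel.
    pos = np.cumsum(vel, axis=0) * dt
    stride_length = np.linalg.norm(pos[-1, :2])
    return pos, stride_length
```

Temporal parameters (stride, stance, and swing durations) then follow directly from the stance-detection timestamps used to segment the strides.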
Adopting an unusual posture can sometimes give rise to paradoxical experiences. For example, the subjective ordering of successive unseen tactile stimuli delivered to the two arms can be affected when people cross them. A growing body of evidence highlights the role played by the parietal cortex in spatio-temporal information processing when sensory stimuli are delivered to the body or when actions are executed; however, little is known about the neural basis of the paradoxical feelings that result from such unusual limb positions. Here, we demonstrate increased fMRI activation in the left posterior parietal cortex when human participants adopted a crossed-hands posture with their eyes closed. Furthermore, by assessing tactile temporal order judgments (TOJs) in the same individuals, we observed a positive association between activity in this area and the degree of TOJ reversal caused by crossing the arms. The strongest positive association was observed in the left intraparietal sulcus. This result implies that the left posterior parietal cortex may be critically involved in monitoring limb position and in spatio-temporal binding when serial events are delivered to the limbs.
The ownership of one's body parts is a fundamental aspect of self-consciousness. Accumulating empirical evidence supports the existence of this sense in humans and nonhuman primates, but it is unclear whether nonprimate mammals experience similar feelings. The present study therefore used rubber tails to investigate body ownership in rodents. When the real tail and the rubber tail were stroked synchronously, the mice responded to grasping of the rubber tail as if their own tails had been touched. In contrast, when the stimuli were delivered asynchronously, the mean response rate to grasping of the rubber tail was significantly lower. These findings suggest that mice may experience body ownership of their tails, analogous to the rubber hand illusion in humans.
Gaze-independent brain-computer interfaces (BCIs) are a potential communication tool for persons with paralysis. This study used affective auditory stimuli to investigate their effects on P300 BCI performance. Fifteen able-bodied participants operated the P300 BCI with positive and negative affective sounds (PA: a meowing cat; NA: a screaming cat). Permuted versions of the positive and negative affective sounds (permuted-PA, permuted-NA) were used for comparison. Electroencephalography (EEG) data were collected, and offline classification accuracies were compared. A visual analog scale (VAS) was used to measure positive and negative affective feelings in the participants. Mean classification accuracies were 84.7% for PA and 67.3% for permuted-PA, while the VAS scores were 58.5 for PA and −12.1 for permuted-PA; the positive affective stimulus showed significantly higher accuracy and VAS scores than the negative affective stimulus. In contrast, mean classification accuracies were 77.3% for NA and 76.0% for permuted-NA, with VAS scores of −50.0 for NA and −39.2 for permuted-NA; these differences were not significant. We found that a positive affective stimulus with accompanying positive affective feelings significantly improved BCI accuracy. Additionally, a patient with amyotrophic lateral sclerosis (ALS) achieved 90% online classification accuracy. These results suggest that affective stimuli may be useful for developing a practical auditory BCI system for patients with disabilities.
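The abstract does not specify the signal-processing pipeline. A common implementation for P300 BCIs is to epoch the EEG around each stimulus onset, baseline-correct, and classify target versus non-target responses with shrinkage-regularized linear discriminant analysis (LDA). The Python sketch below, run on simulated data, shows such a pipeline; all names, parameters, and data are ours, not the study's.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

def epoch_eeg(eeg, stim_onsets, fs, tmin=0.0, tmax=0.8):
    """Cut fixed-length epochs around stimulus onsets.

    eeg         : (n_channels, n_samples) continuous EEG
    stim_onsets : sample indices of stimulus onsets
    fs          : sampling rate [Hz]
    Returns an (n_epochs, n_channels * n_times) feature matrix.
    """
    start = int(tmin * fs)
    length = int((tmax - tmin) * fs)
    epochs = [eeg[:, s + start:s + start + length] for s in stim_onsets]
    X = np.stack(epochs)                 # (n_epochs, n_channels, n_times)
    X = X - X.mean(axis=2, keepdims=True)  # baseline-correct each epoch
    return X.reshape(len(X), -1)

# Simulated data: 320 stimulus epochs, 8 channels, 256 Hz. With random
# noise the classifier learns nothing; this only exercises the pipeline.
rng = np.random.default_rng(0)
fs, n_ch = 256, 8
onsets = rng.integers(0, 60 * fs, size=320)
eeg = rng.standard_normal((n_ch, 61 * fs))
X = epoch_eeg(eeg, onsets, fs)
y = np.r_[np.ones(64), np.zeros(256)]  # 1 = attended (target) stimulus

# Shrinkage LDA regularizes the covariance estimate, which matters when
# features (channels x time points) outnumber the available epochs.
clf = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")
print(cross_val_score(clf, X, y, cv=5).mean())
```

In an online setting, the per-epoch classifier scores for each candidate stimulus are accumulated over repetitions and the stimulus with the highest summed score is selected.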