Personalized exoskeleton assistance provides users with the largest improvements in walking speed [1] and energy economy [2–4], but requires lengthy tests under unnatural laboratory conditions. Here we show that exoskeleton optimization can be performed rapidly and under real-world conditions. We designed a portable ankle exoskeleton based on insights from tests with a versatile laboratory testbed. We developed a data-driven method for optimizing exoskeleton assistance outdoors using wearable sensors and found that it was as effective as laboratory methods but identified optimal parameters four times faster. We performed real-world optimization using data collected during many short bouts of walking at varying speeds. Assistance optimized during one hour of naturalistic walking in a public setting increased self-selected speed by 9 ± 4% and reduced the energy used to travel a given distance by 17 ± 5% compared with normal shoes. This assistance reduced metabolic energy consumption by 23 ± 8% when participants walked on a treadmill at a standard speed of 1.5 m s⁻¹. Human movements encode information that can be used to personalize assistive devices and enhance performance.
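The abstract does not specify the optimizer, so the following is only a minimal sketch of a human-in-the-loop parameter search in the same spirit: sample candidate assistance settings, score each with a sensor-derived estimate of metabolic cost, and keep the best. The two normalized parameters (peak torque, timing), the surrogate cost function, and its optimum at (0.6, 0.52) are all hypothetical stand-ins.

```python
import random

def estimated_cost(peak_torque, timing):
    # Hypothetical surrogate for the wearable-sensor estimate of metabolic
    # cost; pretend the (unknown) user-specific optimum is at (0.6, 0.52).
    return (peak_torque - 0.6) ** 2 + (timing - 0.52) ** 2

def optimize_assistance(generations=30, pop_size=8, sigma=0.1, seed=0):
    rng = random.Random(seed)
    best = (0.5, 0.5)  # initial assistance parameters (normalized)
    for _ in range(generations):
        # sample candidate parameter sets around the current best
        candidates = [(best[0] + rng.gauss(0, sigma),
                       best[1] + rng.gauss(0, sigma))
                      for _ in range(pop_size)]
        candidates.append(best)  # elitism: never lose the incumbent
        best = min(candidates, key=lambda p: estimated_cost(*p))
    return best

peak_torque, timing = optimize_assistance()
```

Because the incumbent is re-evaluated each generation, the estimated cost can only decrease or stay equal; in practice the noisy, bout-by-bout cost estimates described in the abstract would replace the analytic surrogate.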
This paper presents Stanford Doggo, a quasi-direct-drive quadruped capable of dynamic locomotion. This robot matches or exceeds common performance metrics of state-of-the-art legged robots. In terms of vertical jumping agility, a measure of average vertical speed, Stanford Doggo matches the best-performing animal and surpasses the previous best robot by 22%. An overall design architecture is presented, with a focus on our quasi-direct-drive design methodology. The hardware and software needed to replicate this robot are open-source, require only hand tools for manufacturing and assembly, and cost less than $3000.
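The abstract defines vertical jumping agility as a measure of average vertical speed. Assuming this means height gained per jump cycle (the exact measurement protocol is in the paper, not the abstract), the arithmetic is simply:

```python
def vertical_jumping_agility(jump_height_m, jump_cycle_s):
    # Average vertical speed: height gained divided by the duration of one
    # jump cycle (m/s). The definition here is an assumption inferred from
    # the abstract, not the paper's exact protocol.
    return jump_height_m / jump_cycle_s

# hypothetical numbers, not Stanford Doggo's reported values
agility = vertical_jumping_agility(0.48, 0.28)
```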
Objective: Analyzing human motion is essential for diagnosing movement disorders and guiding rehabilitation for conditions like osteoarthritis, stroke, and Parkinson's disease. Optical motion capture systems are the standard for estimating kinematics, but the equipment is expensive and requires a predefined space. While wearable sensor systems can estimate kinematics in any environment, existing systems are generally less accurate than optical motion capture. Many wearable sensor systems require a computer in close proximity and use proprietary software, limiting experimental reproducibility. Methods: Here, we present OpenSenseRT, an open-source and wearable system that estimates upper and lower extremity kinematics in real time by using inertial measurement units and a portable microcontroller. Results: We compared the OpenSenseRT system to optical motion capture and found an average RMSE of 4.4 degrees across 5 lower-limb joint angles during three minutes of walking and an average RMSE of 5.6 degrees across 8 upper extremity joint angles during a Fugl-Meyer task. The open-source software and hardware are scalable, tracking 1 to 14 body segments, with one sensor per segment. A musculoskeletal model and inverse kinematics solver estimate kinematics in real time. The computation frequency depends on the number of tracked segments, but is sufficient for real-time measurement for many tasks of interest; for example, the system can track 7 segments at 30 Hz in real time. The system uses off-the-shelf parts costing approximately $100 USD plus $20 for each tracked segment. Significance: The OpenSenseRT system is validated against optical motion capture, low-cost, and simple to replicate, enabling movement analysis in clinics, homes, and free-living settings.
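The validation above reports RMSE between IMU-estimated and optical-motion-capture joint angles. A minimal sketch of that comparison, with hypothetical angle series standing in for the real recordings:

```python
import math

def rmse_deg(estimated, reference):
    # Root-mean-square error between two equal-length joint-angle
    # series, in degrees.
    assert len(estimated) == len(reference) and estimated
    return math.sqrt(sum((e - r) ** 2 for e, r in zip(estimated, reference))
                     / len(estimated))

imu_angles = [10.2, 12.1, 14.0]    # e.g., knee flexion from the IMU pipeline
mocap_angles = [10.0, 12.5, 13.5]  # same frames from optical motion capture
error = rmse_deg(imu_angles, mocap_angles)
```

The per-joint averages quoted in the abstract (4.4 and 5.6 degrees) would come from averaging this quantity across joints and trials.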
Physical inactivity is the fourth leading cause of global mortality. Health organizations have requested a tool to objectively measure physical activity. Respirometry and doubly labeled water accurately estimate energy expenditure, but are infeasible for everyday use. Smartwatches are portable, but have significant errors. Existing wearable methods poorly estimate time-varying activity, which comprises 40% of daily steps. Here, we present a Wearable System that estimates metabolic energy expenditure in real time during common steady-state and time-varying activities with substantially lower error than state-of-the-art methods. We perform experiments to select sensors, collect training data, and validate the Wearable System with new subjects and new conditions for walking, running, stair climbing, and biking. The Wearable System uses inertial measurement units worn on the shank and thigh, as they distinguish lower-limb activity better than wrist or trunk kinematics and converge more quickly than physiological signals. When evaluated with a diverse group of new subjects, the Wearable System has a cumulative error of 13% across common activities, significantly less than 42% for a smartwatch and 44% for an activity-specific smartwatch. This approach enables accurate physical activity monitoring, which could support new energy-balance systems for weight management or large-scale activity monitoring.
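The cumulative-error figures above compare total estimated energy against total measured energy over a bout. The abstract does not give the exact formula, so the following sketch assumes the natural definition (absolute difference of integrated energy, as a fraction of the measured total); the rates and timestep are hypothetical.

```python
def cumulative_error(est_rates_w, meas_rates_w, dt_s):
    # Integrate estimated and measured metabolic rates (watts) over a
    # bout sampled every dt_s seconds, then report the absolute
    # difference as a fraction of the measured total energy.
    est_total = sum(r * dt_s for r in est_rates_w)    # joules
    meas_total = sum(r * dt_s for r in meas_rates_w)  # joules
    return abs(est_total - meas_total) / meas_total

# hypothetical two-second bout: estimator reads high by ~20%
err = cumulative_error([300.0, 310.0], [250.0, 260.0], dt_s=1.0)
```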
This paper presents the Tact hand, an anthropomorphic, open-source, myoelectric prosthetic hand designed for use by people with transradial amputations in developing countries. This hand matches or exceeds the performance of other state-of-the-art myoelectric prosthetic hands, but costs two orders of magnitude less ($250) and is easy to manufacture with a 3D printer and off-the-shelf parts. We describe our design process, evaluate the Tact hand with both qualitative and quantitative measures of performance, and show examples of using this hand to grasp household objects.