Signals derived from the rat motor cortex can be used for controlling one-dimensional movements of a robot arm. It remains unknown, however, whether real-time processing of cortical signals can be employed to reproduce, in a robotic device, the kind of complex arm movements used by primates to reach objects in space. Here we recorded the simultaneous activity of large populations of neurons, distributed in the premotor, primary motor and posterior parietal cortical areas, as non-human primates performed two distinct motor tasks. Accurate real-time predictions of one- and three-dimensional arm movement trajectories were obtained by applying both linear and nonlinear algorithms to cortical neuronal ensemble activity recorded from each animal. In addition, cortically derived signals were successfully used for real-time control of robotic devices, both locally and through the Internet. These results suggest that long-term control of complex prosthetic robot arm movements can be achieved by simple real-time transformations of neuronal population signals derived from multiple cortical areas in primates.
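The abstract mentions applying linear algorithms to map neuronal ensemble activity onto arm trajectories. The sketch below is not the authors' implementation; it is a minimal illustration of the general idea, fitting a linear decoder by least squares from a tap-delay line of binned population firing rates (synthetic stand-in data) to a 3-D position signal.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins: binned spike counts from a 32-neuron ensemble,
# and a 3-D hand trajectory linearly related to that activity plus noise.
T, n_neurons, lags = 200, 32, 5
rates = rng.poisson(5.0, size=(T, n_neurons)).astype(float)

# Design matrix of lagged population activity (tap-delay line), so each
# row holds the recent history of the whole ensemble.
X = np.hstack([np.roll(rates, k, axis=0) for k in range(lags)])[lags:]
true_W = rng.normal(size=(X.shape[1], 3))
pos = X @ true_W + rng.normal(scale=0.1, size=(X.shape[0], 3))

# Fit the linear decoder by least squares and reconstruct the trajectory.
W, *_ = np.linalg.lstsq(X, pos, rcond=None)
pred = X @ W
r = np.corrcoef(pred[:, 0], pos[:, 0])[0, 1]
```

On data that is genuinely linear in the ensemble history, as here, the predicted and actual trajectories correlate almost perfectly; with real cortical recordings the fit is of course much looser.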
The biomechanics of skin and underlying tissues plays a fundamental role in the human sense of touch. It governs the mechanics of contact between the skin and an object, the transmission of the mechanical signals through the skin, and their transduction into neural signals by the mechanoreceptors. To better understand the mechanics of touch, it is necessary to establish quantitative relationships between the loads imposed on the skin by an object, the state of stresses/strains at mechanoreceptor locations, and the resulting neural response. Towards this goal, 3-D finite-element models of human and monkey fingertips with realistic external geometries were developed. By computing fingertip model deformations under line loads, it was shown that a multi-layered model was necessary to match previously obtained in vivo data on skin surface displacements. An optimal ratio of elastic moduli of the layers was determined through numerical experiments whose results were matched with empirical data. Numerical values of the elastic moduli of the skin layers were obtained by matching computed results with empirically determined force-displacement relationships for a variety of indentors. Finally, as an example of the relevance of the model to the study of tactile neural response, the multilayered 3-D finite-element model was shown to be able to predict the responses of the slowly adapting type I (SA-I) mechanoreceptors to indentations by complex object shapes.
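The abstract describes determining an optimal ratio of layer elastic moduli by matching computed results against empirical force-displacement data. The following is only a toy illustration of that fitting loop, not the paper's finite-element procedure: it stands in a two-layer springs-in-series model with made-up thicknesses and synthetic "measured" data, and grid-searches the modulus ratio that minimizes the squared mismatch.

```python
import numpy as np

# Toy two-layer model: springs in series, so surface displacement under
# force F is d(F) = F * (t1/E1 + t2/E2), with E1 = ratio * E2.
t1, t2 = 0.7, 2.3            # layer thicknesses (mm), illustrative only
E2 = 0.05                    # deeper-layer modulus (MPa), illustrative only
forces = np.linspace(0.1, 1.0, 10)

true_ratio = 10.0            # ground truth used to fabricate the "data"
measured = forces * (t1 / (true_ratio * E2) + t2 / E2)

def model(ratio):
    """Predicted force-displacement curve for a candidate modulus ratio."""
    return forces * (t1 / (ratio * E2) + t2 / E2)

# Grid search: pick the ratio whose predictions best match the data.
ratios = np.linspace(1.0, 50.0, 491)
errors = [np.sum((model(r) - measured) ** 2) for r in ratios]
best = ratios[int(np.argmin(errors))]
```

In the actual study the forward model is a 3-D finite-element computation rather than a closed-form spring law, but the outer loop, comparing computed and empirical responses over candidate parameter values, has this shape.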
Investigating virtual environments has become an increasingly interesting research topic for engineers, computer and cognitive scientists, and psychologists. Although there have been several recent studies focused on the development of multimodal virtual environments (VEs) to study human-machine interactions, less attention has been paid to human-human and human-machine interactions in shared virtual environments (SVEs), and, to our knowledge, none at all to the extent to which the addition of haptic communication between people would contribute to the shared experience. We have developed a multimodal shared virtual environment and performed a set of experiments with human subjects to study the role of haptic feedback in collaborative tasks and whether haptic communication through force feedback can facilitate a sense of being with, and collaborating with, a remote partner. The study concerns a scenario where two participants at remote sites must co-operate to perform a joint task in a SVE. The goals of the study are (1) to assess the impact of force feedback on task performance, (2) to better understand the role of haptic communication in human-human interactions, (3) to study the impact of touch on the subjective sense of collaborating with a human as reported by the participants based on what they could see and feel, and (4) to investigate whether gender, personality, or emotional experiences of users can affect haptic communication in SVEs.
The outcomes of this research can inform the development of next-generation human-computer interfaces and network protocols that integrate touch and force feedback technology into the Internet; of protocols and techniques for collaborative teleoperation tasks such as hazardous-material removal, space-station repair, and remote surgery; and of shared virtual worlds for everyday collaborative tasks such as co-operative teaching, training, planning and design, cybergames, and social gatherings. Our results suggest that haptic feedback significantly improves task performance and contributes to a sense of 'togetherness' in SVEs. In addition, the results show that experiencing visual feedback alone first, followed by visual plus haptic feedback, elicits better performance than receiving visual plus haptic feedback first followed by visual feedback alone.
Robust manipulation and insertion of small parts can be challenging because of the small tolerances typically involved. The key to robust control of these kinds of manipulation interactions is accurate tracking and control of the parts involved. Typically, this is accomplished using visual servoing or force-based control. However, these approaches have drawbacks. Instead, we propose a new approach that uses tactile sensing to accurately localize the pose of a part grasped in the robot hand. Using a feature-based matching technique in conjunction with a newly developed tactile sensing technology known as GelSight that has much higher resolution than competing methods, we synthesize high-resolution height maps of object surfaces. As a result of these high-resolution tactile maps, we are able to localize small parts held in a robot hand very accurately. We quantify localization accuracy in benchtop experiments and experimentally demonstrate the practicality of the approach in the context of a small parts insertion problem.
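The abstract's core step is matching a sensed tactile height map against a reference to recover the pose of a grasped part. The sketch below is not the paper's feature-based method; it is a minimal stand-in showing the simplest version of the same idea, brute-force template matching of a noisy, shifted imprint against a reference map on synthetic data, recovering a 2-D translation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for a GelSight-style height map: a reference map of
# a part surface, and a sensed imprint that is a shifted, noisy crop.
ref = rng.random((60, 60))
dy, dx = 7, 12                    # true offset of the grasped part
patch = ref[dy:dy + 30, dx:dx + 30] + rng.normal(scale=0.01, size=(30, 30))

# Slide the imprint over the reference and score each offset by the sum
# of squared differences; the minimum gives the estimated translation.
best_score, best_off = np.inf, None
for y in range(ref.shape[0] - 30 + 1):
    for x in range(ref.shape[1] - 30 + 1):
        score = np.sum((ref[y:y + 30, x:x + 30] - patch) ** 2)
        if score < best_score:
            best_score, best_off = score, (y, x)
```

A practical system would match sparse surface features and estimate a full planar or 6-DOF pose rather than exhaustively scoring translations, but the matching objective is analogous.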