In the last decade, we have witnessed a drastic change in the form factor of audio and vision technologies, from heavy, grounded machines to lightweight devices that naturally fit our bodies. However, only recently have haptic systems started to be designed with wearability in mind. The wearability of haptic systems enables novel forms of communication, cooperation, and integration between humans and machines. Wearable haptic interfaces can communicate with their human wearers during interaction with the environment they share, in a natural yet private way. This paper presents a taxonomy and review of wearable haptic systems for the fingertip and the hand, focusing on those systems that directly address wearability challenges. The paper also discusses the main technological and design challenges for the development of wearable haptic interfaces, and it reports on the future perspectives of the field. Finally, the paper includes two tables summarizing the characteristics and features of the most representative wearable haptic systems for the fingertip and the hand.
Background: Our body schema gives the subjective impression of being highly stable. However, a number of easily evoked illusions illustrate its remarkable malleability. In the rubber-hand illusion, illusory ownership of a rubber hand is evoked by synchronous visual and tactile stimulation on a visible rubber arm and on the hidden real arm. Ownership is concurrent with a proprioceptive illusion of displacement of the arm position towards the fake arm. We have previously shown that this illusion of ownership, together with the proprioceptive displacement, also occurs towards a virtual 3D projection of an arm when the appropriate synchronous visuotactile stimulation is provided. Our objective here was to explore whether these illusions (ownership and proprioceptive displacement) can be induced by synchronous visuomotor stimulation alone, in the absence of tactile stimulation.

Methodology/Principal Findings: To achieve this we used a data-glove whose sensors transmit the positions of the fingers to a virtually projected hand in the synchronous but not in the asynchronous condition. The illusion of ownership was measured by means of questionnaires. Questions related to ownership gave significantly larger values for the synchronous than for the asynchronous condition. Proprioceptive displacement provided an objective measure of the illusion and had a median value of 3.5 cm difference between the synchronous and asynchronous conditions. In addition, the correlation between the feeling of ownership of the virtual arm and the size of the drift was significant.

Conclusions/Significance: We conclude that synchrony between visual and proprioceptive information, along with motor activity, is able to induce an illusion of ownership over a virtual arm. This has implications regarding the brain mechanisms underlying body ownership as well as the use of virtual bodies in therapies and rehabilitation.
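The synchronous/asynchronous manipulation described above can be pictured as a delay line between the glove samples and the virtual hand: in the synchronous condition each finger-position sample drives the virtual hand immediately, while in the asynchronous condition it is shown late. This is a minimal illustrative sketch; the function name and the delay length are assumptions, not details reported in the study.

```python
from collections import deque

def make_virtual_hand_stream(delay_samples=0):
    """Map glove finger-position samples onto the virtual hand.

    delay_samples = 0 mimics the synchronous condition; a positive
    value mimics the asynchronous (delayed) condition. The delay
    length is an assumption for illustration only.
    """
    buffer = deque([None] * delay_samples)

    def step(glove_sample):
        buffer.append(glove_sample)
        # The sample actually rendered on the virtual hand this frame
        # (None while the delay buffer is still filling).
        return buffer.popleft()

    return step
```

With `delay_samples=0` the returned `step` echoes each sample back at once; with `delay_samples=2` the virtual hand lags the glove by two frames.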
This paper proposes a new multimodal architecture for gaze-independent brain-computer interface (BCI)-driven control of a robotic upper-limb exoskeleton for stroke rehabilitation, providing active assistance in the execution of reaching tasks in a real-setting scenario. At the level of action planning, the patient's intention is decoded by means of an active vision system, combining a Kinect-based vision system, which can robustly identify and track 3-D objects online, with an eye-tracking system for object selection. At the level of action generation, a BCI is used to detect the patient's intention to move his/her own arm, on the basis of brain activity analyzed during motor imagery. The main kinematic parameters of the robot-assisted reaching movement (i.e., speed, acceleration, and jerk) are modulated by the output of the BCI classifier, so that the movement is performed under continuous control of the patient's brain activity. The system was experimentally evaluated in a group of three healthy volunteers and four chronic stroke patients. Experimental results show that all subjects were able to operate the exoskeleton by BCI with a classification accuracy of 89.4±5.0% in the robot-assisted condition, with no difference in performance observed between stroke patients and healthy subjects. This indicates the high potential of the proposed gaze-BCI-driven robotic assistance for neurorehabilitation of patients with motor impairments after stroke, starting from the earliest phase of recovery.
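The modulation of reaching kinematics by the BCI classifier output can be illustrated with a small sketch. The abstract does not specify the actual modulation law, so the linear mapping, the function name, and the `v_max` value below are assumptions, not the paper's implementation.

```python
def modulate_speed(p_mi, v_max=0.2):
    """Scale robot-assisted reaching speed by the BCI classifier output.

    p_mi:  probability assigned to the motor-imagery class (0..1).
    v_max: maximum end-effector speed in m/s (illustrative value).

    A simple linear mapping: the stronger the detected motor-imagery
    activity, the faster the assisted movement. This is a plausible
    sketch, not the modulation law used in the paper.
    """
    p = min(max(p_mi, 0.0), 1.0)  # clamp to a valid probability
    return p * v_max
```

In this scheme the same classifier output could analogously scale acceleration and jerk limits, keeping the assisted movement under continuous brain-activity control.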
This paper presents a novel electromyography (EMG)-driven hand exoskeleton for bilateral rehabilitation of grasping in stroke. The developed hand exoskeleton was designed with two distinctive features: (a) kinematics with intrinsic adaptability to the patient's hand size, and (b) a free-palm and free-fingertip design, preserving the residual sensory-perceptual capability of touch during assisted grasping of real objects. In the envisaged bilateral training strategy, the patient's non-paretic hand acts as guidance for the paretic hand in grasping tasks. The grasping force exerted by the non-paretic hand is estimated in real time from EMG signals and then replicated as robotic assistance for the paretic hand by means of the hand exoskeleton. Estimating the grasping force through EMG makes it possible to perform rehabilitation exercises with any graspable, non-sensorized object. This paper presents the system design, development, and experimental evaluation. Experiments were performed with a group of six healthy subjects and two chronic stroke patients executing robot-assisted grasping tasks. The results, in terms of performance in estimating and modulating the robotic assistance and of the outcomes of the pilot rehabilitation sessions with stroke patients, support the validity of the proposed approach for application in stroke rehabilitation.
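Real-time grasp-force estimation from EMG is commonly built from a rectified, smoothed envelope of the raw signal followed by a calibrated mapping to force. The sketch below shows that generic pipeline; the window size, the linear mapping, and the `gain` constant are hypothetical illustration values, not the calibration used in the paper.

```python
import numpy as np

def emg_envelope(emg, window=50):
    """Rectified moving-average envelope of a raw EMG signal.

    emg:    1-D array of raw EMG samples.
    window: smoothing window length in samples (illustrative value).
    """
    rectified = np.abs(emg)
    kernel = np.ones(window) / window
    return np.convolve(rectified, kernel, mode="same")

def estimate_grasp_force(envelope, gain=30.0):
    """Map the EMG envelope to an estimated grasp force in newtons.

    A linear envelope-to-force mapping; `gain` (N per unit envelope)
    is a hypothetical calibration constant for illustration.
    """
    return gain * envelope
```

In a bilateral scheme like the one described, the force estimated from the non-paretic hand's EMG would then be fed, frame by frame, as the assistance reference for the exoskeleton on the paretic hand.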
Background and Purpose: Although there is strong evidence on the beneficial effects of virtual reality (VR)-based rehabilitation, it is not yet well understood how the different aspects of these systems affect recovery. Consequently, we do not know exactly which features of VR neurorehabilitation systems are decisive in conveying their beneficial effects.

Methods: To specifically address this issue, we developed 3 different configurations of the same VR-based rehabilitation system, the Rehabilitation Gaming System, using 3 different interface technologies: vision-based tracking, haptics, and a passive exoskeleton. Forty-four patients with chronic stroke were randomly allocated to one of the configurations and used the system for 35 minutes a day, 5 days a week, for 4 weeks.

Results: Our results revealed significant within-subject improvements on most of the standard clinical evaluation scales for all groups. Specifically, we observed that the beneficial effects of VR-based training are modulated by the use or non-use of compensatory movement strategies and by the specific sensorimotor contingencies presented to the user, that is, visual feedback versus combined visual-haptic feedback.

Conclusions: Our findings suggest that the beneficial effects of VR-based neurorehabilitation systems such as the Rehabilitation Gaming System for the treatment of chronic stroke depend on the specific interface systems used. These results have strong implications for the design of future VR rehabilitation strategies that aim to maximize functional outcomes and their retention.

Clinical Trial Registration: This trial was not registered because it is a small clinical study evaluating the feasibility of prototype devices.