Upper-limb prosthesis users lack proprioception of their artificial arm and rely heavily on vision to understand its configuration. With the goal of reducing the effort spent on visual monitoring during upper-limb prosthesis use, this study investigates whether haptic feedback can relay the configuration of a virtual hand in the absence of sight. Two mappings from waistbelt-mounted tactor vibration patterns to hand configuration are explored: (1) Synergy-based hand motions, derived from a principal component analysis of an aggregate of hand motions, and (2) Decoupled hand motions, which include experimenter-selected motions such as finger grasp and finger spread. Results show that users can identify complex hand configurations with vibrotactile feedback patterns based on both the Synergy-based and Decoupled methods, although 30-45 seconds are required to achieve this task. Findings also demonstrate that users are likely to memorize the correspondence between the overall feel of a tactor pattern and a hand configuration, rather than constructing the hand configuration by isolating and considering each tactor individually. Finally, results indicate that hand configuration is most accurately conveyed by maximizing information along a synergy-based space.
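To make the synergy-based mapping concrete, the following is a minimal sketch, in Python, of how hand-motion synergies might be extracted with principal component analysis and then mapped to tactor drive levels. The data, dimensions, and the `tactor_amplitudes` helper are hypothetical illustrations under stated assumptions, not the study's implementation.

```python
# Minimal sketch (not the authors' implementation): extracting hand-motion
# "synergies" via principal component analysis of recorded joint angles.
# Assumes a hypothetical dataset of hand poses, each described by joint angles.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

# Hypothetical data: 1000 hand poses x 15 joint angles (radians).
# In the study, this would be an aggregate of recorded hand motions.
n_poses, n_joints = 1000, 15
poses = rng.normal(size=(n_poses, n_joints))

# Fit PCA; the leading components are the "synergies" -- coordinated
# joint motions that capture most of the variance across poses.
pca = PCA(n_components=4)
synergy_coords = pca.fit_transform(poses)  # each pose expressed in synergy space
synergies = pca.components_                # rows: joint-angle directions

print("variance explained per synergy:", pca.explained_variance_ratio_)

# Illustration only: a vibrotactile mapping could drive the amplitude of the
# i-th tactor from the pose's coordinate along the i-th synergy.
def tactor_amplitudes(pose, pca, scale=1.0):
    coords = pca.transform(pose.reshape(1, -1))[0]
    return np.clip(np.abs(coords) * scale, 0.0, 1.0)

print("example tactor amplitudes:", tactor_amplitudes(poses[0], pca))
```

In this sketch, the number of retained components and the amplitude scaling are arbitrary choices; the study's actual mappings from synergy coordinates to tactor patterns are described in its methods.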