In this study, we developed a low-cost simulated testbed of a physically interactive virtual reality (VR) system and evaluated its efficacy as an occupational virtual trainer for human-robot collaborative (HRC) tasks. The VR system could be implemented in industrial training applications for sensorimotor skill acquisition and for identifying potential task-, robot-, and human-induced hazards in industrial environments. One of the challenges in designing and implementing such a simulation testbed is the effective integration of virtual and real objects and environments, including human movement biomechanics. Therefore, this study aimed to compare the movement kinematics (joint angles) and kinetics (center of pressure) of human participants performing pick-and-place lifting tasks with and without a physically interactive VR testbed. Results showed marginal differences in human movement kinematics and kinetics between the real- and virtual-environment tasks, suggesting effective transfer of training benefits from VR to real-life situations.
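As a rough illustration of how the kinematic comparison described above might be quantified, the sketch below time-normalizes a joint-angle (or center-of-pressure) trajectory from each condition and reports the Pearson correlation and RMSE between them. The resampling length, variable names, and statistics are assumptions for illustration, not the study's exact analysis.

```python
# Illustrative sketch: compare a real-environment and a VR joint-angle
# trajectory after time-normalizing both to 0-100% of the task cycle.
# All names and the 101-point resampling are assumptions, not the study's
# actual processing pipeline.
import numpy as np
from scipy.stats import pearsonr

def normalize_cycle(signal, n_points=101):
    """Resample a trajectory to a fixed number of samples (0-100% of the task)."""
    x_old = np.linspace(0.0, 1.0, len(signal))
    x_new = np.linspace(0.0, 1.0, n_points)
    return np.interp(x_new, x_old, np.asarray(signal, dtype=float))

def compare_trajectories(real_angle, vr_angle):
    """Return (Pearson r, RMSE) between time-normalized RE and VR trajectories."""
    real_n = normalize_cycle(real_angle)
    vr_n = normalize_cycle(vr_angle)
    r, _ = pearsonr(real_n, vr_n)
    rmse = float(np.sqrt(np.mean((real_n - vr_n) ** 2)))
    return r, rmse
```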
In this study, we evaluated the user experience of a physically interactive virtual reality (VR) system developed to provide passive kinesthetic haptics to enhance motor learning during occupational virtual training. We compared user experience (e.g., perceived ease of use, ease of learning, and usefulness) and functional workload between real and virtual environments by simulating a pick-and-place lifting task in both environments. Results showed that user experience (ease of use, ease of learning, and usefulness) increased, and overall functional workload decreased, with the progression of successive VR tasks.
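The trend reported here (user experience rising and workload falling across successive VR tasks) could be summarized, for example, by a per-trial mean and a fitted linear slope. The following minimal sketch assumes Likert-style usability ratings and a generic workload score; the data layout and names are illustrative, not the study's actual scoring procedure.

```python
# Minimal sketch: summarize how a questionnaire or workload measure changes
# over successive VR trials. The list-of-lists layout (per-trial participant
# scores) is an assumed data format for illustration only.
import numpy as np

def trial_trend(scores_by_trial):
    """scores_by_trial: one list of participant scores per successive trial.
    Returns the per-trial means and the fitted linear slope across trials
    (positive slope: rising, e.g., ease of use; negative: falling, e.g., workload)."""
    means = np.array([np.mean(trial) for trial in scores_by_trial])
    trials = np.arange(1, len(means) + 1)
    slope = np.polyfit(trials, means, 1)[0]
    return means, slope
```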
This study aimed to evaluate the visual-spatiotemporal coregistration of real and virtual objects’ movement dynamics in a low-cost physics-based virtual reality (VR) system that provides real cutaneous and kinesthetic haptic feedback from the objects themselves rather than computer-generated haptic feedback. Twelve healthy participants performed three human-robot collaborative (HRC) sequential pick-and-place lifting tasks while motion capture and VR systems traced the movement kinematics of the real and virtual objects, respectively. We used an iterative closest point algorithm to transform the 3D spatial point clouds of the VR system into the coordinate frame of the motion capture system. We employed a novel algorithm and principal component analysis to calculate, respectively, the visual and spatiotemporal coregistration precision between virtual and real objects. Results showed a high correlation (r > 0.96) between the real and virtual objects’ movement dynamics, with linear and angular coregistration errors of less than 5 cm and 8°, respectively. A low temporal registration error (<12 ms) was also observed, and only along the vertical axis. The visual registration data indicated that the real cutaneous and kinesthetic haptics provided by the physical objects in the virtual environment enhanced users’ proprioception and visuomotor functions.
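For context, the coordinate-frame alignment step mentioned above (iterative closest point) can be sketched as repeated nearest-neighbor matching between the two point clouds followed by a least-squares rigid fit (SVD/Kabsch). The code below is a generic illustration with assumed array names and convergence criteria, not the authors' implementation.

```python
# Generic ICP sketch: align VR-tracked points to the motion-capture frame.
# Variable names, iteration cap, and tolerance are illustrative assumptions.
import numpy as np
from scipy.spatial import cKDTree

def best_fit_transform(src, dst):
    """Least-squares rigid transform (Kabsch/SVD) mapping src onto dst."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:      # guard against an improper (reflected) rotation
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t

def icp(vr_points, mocap_points, max_iter=50, tol=1e-6):
    """Iteratively register the VR point cloud to the motion-capture point cloud."""
    src = np.asarray(vr_points, dtype=float).copy()
    dst = np.asarray(mocap_points, dtype=float)
    tree = cKDTree(dst)
    prev_err = np.inf
    for _ in range(max_iter):
        dist, idx = tree.query(src)           # closest mocap point for each VR point
        R, t = best_fit_transform(src, dst[idx])
        src = src @ R.T + t
        err = dist.mean()
        if abs(prev_err - err) < tol:         # stop when the residual stabilizes
            break
        prev_err = err
    return src, err
```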
Recent advancements in VR technology facilitate
tracking real-world objects and users' movements in the virtual environment
(VE) and inspire researchers to develop a physics-based haptic system (i.e.,
real object haptics) instead of computer-generated haptic feedback. However,
there is limited research on the efficacy of such VR systems in enhancing
operators’ sensorimotor learning for tasks with high motor and physical
demands. Therefore, this study aimed to design and evaluate the efficacy of a
physics-based virtual reality (VR) system that provides users with realistic
cutaneous and kinesthetic haptic feedback. We designed a physics-based VR
system, named PhyVirtual, and simulated human-robot collaborative (HRC)
sequential pick-and-place lifting tasks in the VE. Participants performed the
same tasks in the real environment (RE) with human-human collaboration instead
of human-robot collaboration. We used a custom-designed questionnaire, the
NASA-TLX, and electromyography (EMG) activity from the biceps and the middle
and anterior deltoid muscles to determine user experience, workload, and
neuromuscular dynamics, respectively. Overall, the majority of responses (>65%)
demonstrated that the system is easy to use, easy to learn, and effective in
improving motor skill performance. Compared with tasks performed in the RE,
the PhyVirtual system placed a significantly lower physical demand (124.90%; p
< 0.05) on the user. The EMG data exhibited similar trends (p
> 0.05; r > 0.89) in both environments. These results show that the
PhyVirtual system is an effective tool to simulate safe human-robot
collaboration commonly seen in many modern warehousing settings. Moreover, it can
be used as a viable replacement for live sensorimotor training in a wide range
of fields.
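As a rough sketch of the kind of EMG comparison described above, the code below band-pass filters and rectifies a raw signal, low-passes it into a linear envelope, and correlates the real-environment and virtual-environment envelopes. The sampling rate and filter cut-offs are assumptions, not the authors' exact processing pipeline.

```python
# Illustrative EMG comparison: linear envelopes for RE and VE trials,
# then a Pearson correlation between them. Sampling rate and cut-off
# frequencies are assumed values, not the study's settings.
import numpy as np
from scipy.signal import butter, filtfilt
from scipy.stats import pearsonr

FS = 1000.0  # assumed EMG sampling rate (Hz)

def emg_envelope(raw, fs=FS):
    """Band-pass (20-450 Hz), full-wave rectify, then 6 Hz low-pass envelope."""
    b, a = butter(4, [20 / (fs / 2), 450 / (fs / 2)], btype="band")
    band = filtfilt(b, a, np.asarray(raw, dtype=float))
    rect = np.abs(band)
    b_lp, a_lp = butter(4, 6 / (fs / 2))
    return filtfilt(b_lp, a_lp, rect)

def compare_environments(emg_re, emg_ve, fs=FS):
    """Pearson correlation between RE and VE EMG envelopes of matched length."""
    env_re, env_ve = emg_envelope(emg_re, fs), emg_envelope(emg_ve, fs)
    n = min(len(env_re), len(env_ve))
    return pearsonr(env_re[:n], env_ve[:n])
```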