This paper presents a human-scale virtual environment (VE) with haptic feedback, along with two experiments performed in the context of product design. The user interacts with a virtual mock-up using a large-scale bimanual string-based haptic interface called SPIDAR (Space Interface Device for Artificial Reality). An original self-calibration method is proposed. A vibrotactile glove was developed and integrated into the SPIDAR to provide tactile cues to the operator. The purpose of the first experiment was (1) to examine the effect of tactile feedback in a task involving reaching and touching different parts of a digital mock-up, and (2) to investigate the use of sensory substitution in such tasks. The second experiment aimed to investigate the effect of visual and auditory feedback in a car-light maintenance task. Results of the first experiment indicate that users could easily and quickly reach and finely touch the different parts of the digital mock-up when sensory feedback (visual, auditory, or tactile) was present. Results of the second experiment show that visual and auditory feedback improve average placement accuracy by about 54% and 60%, respectively, compared with the open-loop case.
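For illustration, the position of a string-driven effector such as the SPIDAR grip can be recovered from the measured string lengths by least-squares trilateration against the known motor anchor points, and any self-calibration of such a device ultimately rests on this geometric model. The sketch below is a minimal Python example under that assumption; the anchor coordinates, function names, and solver choice are illustrative and not taken from the paper.

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical anchor positions of the SPIDAR string motors (metres),
# e.g. points on a cubic frame; the real geometry is device-specific.
ANCHORS = np.array([
    [0.0, 0.0, 0.0],
    [2.0, 0.0, 0.0],
    [0.0, 2.0, 0.0],
    [0.0, 0.0, 2.0],
    [2.0, 2.0, 2.0],
])

def residuals(position, measured_lengths):
    """Difference between anchor-to-effector distances and the string
    lengths reported by the motor encoders."""
    return np.linalg.norm(ANCHORS - position, axis=1) - measured_lengths

def estimate_effector_position(measured_lengths, initial_guess=(1.0, 1.0, 1.0)):
    """Least-squares trilateration of the effector from string lengths."""
    result = least_squares(residuals, x0=np.asarray(initial_guess),
                           args=(np.asarray(measured_lengths),))
    return result.x

# Example: synthetic lengths for an effector near the frame centre.
true_pos = np.array([1.0, 1.2, 0.9])
lengths = np.linalg.norm(ANCHORS - true_pos, axis=1)
print(estimate_effector_position(lengths))  # ~ [1.0, 1.2, 0.9]
```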
This paper studies the benefits that haptic feedback can provide in performing complex maintenance tasks using virtual mock-ups. We carried out a user study consisting of two experiments in which participants had to perform an accessibility task. A human-scale string-based haptic interface was used to provide the operator with haptic stimuli. A prop was used to provide grasp feedback. A motion-capture system tracks the user's hand and head movements, while a 5DT data glove is used to measure finger flexion. In the first experiment, the effects of haptic (collision) and visual feedback are investigated. In the second experiment, we investigated the effect of haptic guidance on operator performance. The results were analyzed in terms of task completion time and collision avoidance. The experiments show that haptic stimuli were more effective than visual ones. In addition, haptic guidance helped operators correct their trajectories and hence improved their performance.
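Haptic guidance of the kind evaluated in the second experiment is commonly implemented as a virtual fixture: a spring-damper force that pulls the hand toward the nearest point of a reference trajectory. The following sketch illustrates that generic idea; the gains, force cap, polyline path, and function names are assumptions, not the authors' implementation.

```python
import numpy as np

def closest_point_on_polyline(p, polyline):
    """Return the point of a piecewise-linear trajectory closest to p."""
    best, best_d = None, np.inf
    for a, b in zip(polyline[:-1], polyline[1:]):
        ab = b - a
        t = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
        q = a + t * ab
        d = np.linalg.norm(p - q)
        if d < best_d:
            best, best_d = q, d
    return best

def guidance_force(hand_pos, hand_vel, polyline, k=80.0, c=5.0, f_max=4.0):
    """Spring-damper pull toward the reference path, clamped to f_max (N)."""
    target = closest_point_on_polyline(hand_pos, polyline)
    f = k * (target - hand_pos) - c * hand_vel
    norm = np.linalg.norm(f)
    return f if norm <= f_max else f * (f_max / norm)

# Example: a straight removal path and a hand slightly off the path.
path = np.array([[0.0, 0.0, 0.0], [0.3, 0.0, 0.0], [0.3, 0.2, 0.0]])
print(guidance_force(np.array([0.15, 0.05, 0.0]), np.zeros(3), path))
```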
This paper presents a methodology for the efficient integration of CAD models in a physics-based virtual reality simulation that provides the user with multimodal feedback. The user interacts with the virtual mock-up using a string-based haptic interface. Hand tracking is performed with a motion capture system. Stereoscopic images are displayed on a 2 m x 2.5 m retro-projected screen and viewed through polarized glasses. The proposed methodology, implemented in a low-cost system, has been validated through an experimental study. Six participants were instructed to remove a car lamp from the virtual mock-up and replace it in the correct position. A prop was used to provide local haptic sensation related to the car lamp. Three experimental conditions were tested for sensory feedback from collisions: (1) no feedback (graphics only), (2) visual feedback, and (3) haptic feedback. Results show that visual and haptic feedback increased performance by 17.8% and 35.2%, respectively, compared with the open-loop case (no feedback).
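The three collision-feedback conditions can be pictured as a per-contact dispatch: for each collision reported by the physics engine, the system either does nothing, highlights the touched part, or renders a penalty force through the haptic interface. The sketch below is a hypothetical illustration of such a dispatch; the penalty-force model, stiffness value, and names are assumptions rather than the paper's actual code.

```python
import numpy as np

# Experimental conditions from the study: no feedback, visual, haptic.
NO_FEEDBACK, VISUAL, HAPTIC = range(3)

def collision_response(condition, contact_normal, penetration_depth,
                       stiffness=500.0):
    """Dispatch feedback for one collision event reported by the physics
    engine. Returns (force_to_render, highlight_part). The penalty-force
    model and the stiffness value are illustrative assumptions."""
    force = np.zeros(3)
    highlight = False
    if penetration_depth <= 0.0:
        return force, highlight
    if condition == HAPTIC:
        # Simple penalty force along the contact normal.
        force = stiffness * penetration_depth * np.asarray(contact_normal)
    elif condition == VISUAL:
        # Colour the colliding part of the mock-up instead of pushing back.
        highlight = True
    return force, highlight

print(collision_response(HAPTIC, [0.0, 1.0, 0.0], 0.004))  # ~2 N upward
```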
We present a methodology for both the efficient integration and the dexterous manipulation of CAD models in a physics-based virtual reality simulation. The user interacts with a virtual car mock-up using a string-based haptic interface that provides force sensation over a large workspace. A prop is used to provide grasp feedback. A motion-capture system is used to track the user's hand and head movements. In addition, a 5DT data glove is used to measure finger flexion. Twelve volunteer participants were instructed to remove a lamp from the virtual mock-up under different conditions. Results revealed that haptic feedback outperformed additional visual feedback in terms of task completion time and collision frequency.
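As a rough illustration of how a prop-based grasp could be driven by the 5DT data glove mentioned above, normalized finger-flexion values can be thresholded with a little hysteresis to decide when the virtual lamp is attached to or released from the hand. Everything in this sketch (thresholds, channel layout, function names) is a hypothetical assumption, not the glove SDK or the authors' method.

```python
# Normalized 5DT flexion: 0.0 = fully extended, 1.0 = fully flexed.
GRASP_ON, GRASP_OFF = 0.6, 0.4  # hysteresis thresholds (assumed)

def update_grasp(flexion, currently_grasping):
    """flexion: iterable of per-finger values in [0, 1]."""
    mean_flex = sum(flexion) / len(flexion)
    if currently_grasping:
        return mean_flex > GRASP_OFF   # release only when clearly open
    return mean_flex > GRASP_ON        # grasp only when clearly closed

grasping = False
for sample in ([0.2] * 5, [0.7] * 5, [0.5] * 5, [0.3] * 5):
    grasping = update_grasp(sample, grasping)
    print(grasping)  # False, True, True, False
```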