This study investigated the ability to localize an impulse vibration source outside the body in two-dimensional space. We tested whether humans can recognize the direction or distance of an impulse vibration source when using their hand to detect the spatiotemporal vibrotactile information carried by the wave propagating from the source. Specifically, we had users place their hands on a silicone rubber sheet in several postures and asked them to indicate the position of the vibration source when a location on the sheet was impulsively indented. Experimental results suggested that the direction of the impulse vibration source can be recognized to some extent, although recognition accuracy depends on hand posture and the position of the source. The best results were achieved when both the fingers and palm were in contact with the sheet and the vibration source was presented near the middle fingertip; the directional recognition error in this case was 6°. In contrast, the results suggest it is difficult to accurately recognize the distance of the vibration source. These findings suggest a new possibility for directional displays in which vibrotactile actuators are embedded at a distance from the user's hand.
Fig. 1. Our proposed method modulates the fine roughness perception of vibrotactile textured surfaces using a pseudo-haptic effect. Our user study showed that users perceived the surface as rougher in response to a particular configuration of the visual feedback.
Abstract—Playing back vibrotactile signals through actuators is commonly used to simulate the tactile feel of virtual textured surfaces. However, there is often a small mismatch between the simulated tactile feel and the feel intended by tactile designers. Thus, a method of modulating vibrotactile perception is required. We focus on fine roughness perception and propose a method that uses a pseudo-haptic effect to modulate the fine roughness perception of vibrotactile textures. Specifically, we slightly modify the on-screen position of the pointer that indicates the touch position on the textured surface. We hypothesized that if users receive vibrational feedback while watching the pointer visually oscillate back/forth and left/right, they would perceive the vibrotactile surface as more uneven. We also hypothesized that as the size of the visual oscillation increases, the modulation of the perceived roughness of the vibrotactile surface would become larger. We conducted user studies to test these hypotheses. Results of the first user study suggested that, with high probability, users felt the vibrotactile texture to be rougher with our method than without it. Results of the second user study suggested that users felt different degrees of roughness for the vibrotactile texture depending on the size of the visual oscillation. These results confirmed our hypotheses and suggest that our method is effective. The same effect could potentially be applied to the visual movement of virtual hands or fingertips when users interact with virtual surfaces using their hands.
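The abstract does not specify how the pointer displacement is computed; the following is a minimal sketch, assuming a sinusoidal jitter whose amplitude corresponds to the "size of visual oscillation" discussed above. The function name `displaced_pointer` and the parameters `amplitude_px` and `freq_hz` are hypothetical, not taken from the paper.

```python
import math
import random

def displaced_pointer(true_x, true_y, t, amplitude_px=2.0, freq_hz=30.0):
    """Return a visually oscillated pointer position (a sketch, not the authors' method).

    The pointer drawn on screen is offset back/forth and left/right around the
    actual touch position; `amplitude_px` is the size of the visual oscillation,
    the parameter hypothesized to scale the perceived roughness.
    """
    # Sinusoidal jitter with a small random phase so the motion does not look
    # perfectly periodic to the user.
    dx = amplitude_px * math.sin(2 * math.pi * freq_hz * t + random.uniform(-0.3, 0.3))
    dy = amplitude_px * math.cos(2 * math.pi * freq_hz * t + random.uniform(-0.3, 0.3))
    return true_x + dx, true_y + dy

# Example: pointer at (100, 200) px, 0.5 s into the stroke, larger oscillation
print(displaced_pointer(100.0, 200.0, 0.5, amplitude_px=4.0))
```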
Providing vibrotactile feedback that corresponds to the state of a virtual textured surface allows users to sense its haptic properties. However, hand-tuning such vibrotactile stimuli for every state of the texture is time-consuming. We therefore propose a new approach for creating models that automatically generate vibrotactile signals from texture images or attributes. In this paper, we make a first attempt to generate vibrotactile stimuli by leveraging deep generative adversarial training. Specifically, we use a conditional generative adversarial network (GAN) to generate the vibration produced while moving a pen across a surface. A preliminary user study showed that users could not discriminate between generated and genuine signals and perceived the generated signals as realistic. Thus, our model can provide an appropriate vibration according to a texture image or its attributes. Our approach is applicable to any case in which users touch various surfaces in a predefined way.
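The abstract does not describe the network architecture; the sketch below shows, under assumed dimensions, how a conditional generator might map a latent noise vector plus a texture-attribute vector to a vibration waveform. The class name, layer sizes, and attribute encoding are illustrative assumptions, not the authors' model.

```python
import torch
import torch.nn as nn

class VibrationGenerator(nn.Module):
    """Minimal conditional generator: noise + texture attributes -> vibration waveform."""

    def __init__(self, noise_dim=64, attr_dim=8, signal_len=1024):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(noise_dim + attr_dim, 256),
            nn.ReLU(),
            nn.Linear(256, 512),
            nn.ReLU(),
            nn.Linear(512, signal_len),
            nn.Tanh(),  # acceleration samples normalized to [-1, 1]
        )

    def forward(self, noise, attributes):
        # Conditioning: concatenate the latent noise with the texture-attribute vector.
        return self.net(torch.cat([noise, attributes], dim=1))

# Example: generate a batch of 4 vibration signals for 4 attribute vectors
g = VibrationGenerator()
z = torch.randn(4, 64)
attrs = torch.rand(4, 8)   # e.g. roughness, hardness, ... (hypothetical encoding)
waveforms = g(z, attrs)    # shape: (4, 1024)
```

In a full conditional GAN, a discriminator would receive the same attribute vector (or texture-image embedding) alongside real or generated waveforms during adversarial training.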
Over the last two decades, pseudo-haptics has been proposed in numerous research papers as a haptic presentation method that does not require a mechanical feedback device. Moreover, applications using pseudo-haptics have been proposed and evaluated in various contexts. However, the findings from these studies have not recently been comprehensively organized in a survey paper. In this article, findings from individual prior studies are summarized, from visual stimulus design through to application proposals. First, we summarize visual stimulus designs for inducing pseudo-haptics, organized by the target haptic object properties. Second, we summarize two special issues in designing pseudo-haptics: (1) workaround designs for the visible mismatch between visual stimuli and user input, and (2) the combined design of pseudo-haptics and physical stimuli. Third, application proposals that use pseudo-haptics for training, assistance, and entertainment are presented. This survey should help not only researchers in academia but also application developers who intend to use pseudo-haptics as a haptic presentation method.