Background: The estimation of relative distance is a perceptual task used extensively in everyday life. This important skill suffers from biases that may be more pronounced when estimation is based on haptics. This is especially true for blind and visually impaired people, for whom haptic estimation of distances is paramount but not systematically trained. We investigated whether a programmable tactile display, used autonomously, can improve distance discrimination ability in blind and severely visually impaired youngsters aged 7 to 22 years.
Methods: Training consisted of four weekly sessions in which participants were asked to haptically find, on the programmable tactile display, the pairs of squares separated by the shortest and the longest distance in tactile images containing multiple squares. A battery of haptic tests with raised-line drawings was administered before and after training, and scores were compared with those of a control group that completed only the haptic battery, without the distance discrimination training on the tactile display.
Results: Both blind and severely visually impaired youngsters became more accurate and faster at the task during training. In the haptic battery, blind and severely visually impaired youngsters who used the programmable display improved in three and two tests, respectively. In contrast, the blind control group improved in only one test, and the severely visually impaired control group in none.
Conclusions: Distance discrimination skills can be trained equally well in both blind and severely visually impaired participants. More importantly, autonomous training with the programmable tactile display had generalized effects beyond the trained task: participants improved not only in the size discrimination test but also in memory span tests. Our study shows that tactile stimulation training requiring minimal human assistance can effectively improve generic spatial skills.
Electronic supplementary material The online version of this article (10.1186/s12984-019-0580-2) contains supplementary material, which is available to authorized users.
In this paper, we describe a recently begun project aimed at teaching echolocation using a mobile game. The presented research concerns initial echolocation tests with real-world obstacles and analogous tests performed using binaural recordings. Tests that included detection and recognition of large obstacles in various environments (padded room, non-padded room, outdoors) were performed by three groups of 10 persons each: blind children, blind adults, and sighted adults. A mixed group of volunteers also tested binaural recordings of the same environments using a mobile application for Android and iOS devices. This preliminary research shows a large variance in the echolocation ability of the participants. Fewer than 20% of the 30 volunteers could reliably (with >80% certainty) localize 1 m- and 2 m-wide walls at distances of 1 to 3 m, while about as many showed no echolocation skills and answered at chance level. On average, sighted adults performed better in the echolocation tests than blind children but worse than blind adults. Tests in outdoor environments yielded much better results than those indoors, and the padded room was marginally better for echolocation than the non-padded room. Performance with recordings was generally worse than in the analogous real-world tests, but the same trends were clearly observed, e.g., a proportional drop in accuracy with distance. The tests with recordings also demonstrated that a repeatable pre-recorded or synthesized clicker played from a loudspeaker was a better solution than recordings made with live clicker sounds.
Pin-array displays are a promising technology for displaying visual information through touch, a crucial issue for blind and partially sighted users. Such displays are programmable and can therefore considerably increase, vary, and tailor the amount of information compared with common embossed paper; beyond Braille, they also make it possible to display graphics. Because the resolution needed to convey simple graphical concepts has not been established, we evaluated the discriminability of tactile symbols at different resolutions and complexity levels in blind, blindfolded low-vision, and sighted participants. We found no differences in discrimination accuracy between tactile symbols organized in 3×3 arrays and those in 4×4 arrays. A metric based on search and discrimination speed did not change across resolutions in blind and low-vision participants, whereas in sighted participants it increased significantly with resolution. We suggest possible guidelines for designing dictionaries of low-resolution tactile symbols. Our results can help designers, ergonomists, and rehabilitators develop usable human-machine interfaces with tactile symbol coding.