Navigation assistive technologies have been designed to support the mobility of people who are blind and visually impaired during independent navigation by providing sensory augmentation, spatial information, and general awareness of their environment. This paper focuses on an extended Usability and User Experience (UX) evaluation of BlindRouteVision, an outdoor navigation smartphone application that aims to address the problems visually impaired pedestrians face when navigating without the aid of sighted guides. The proposed system consists of an Android application that interacts with an external high-accuracy GPS sensor tracking pedestrian mobility in real time, a second external device designed to be mounted on traffic lights for identifying their status, and an ultrasonic sensor for detecting near-field obstacles along the user's route. During outdoor navigation, the system can optionally incorporate public means of transport and offers additional functions, such as placing a phone call and reporting the user's current location in case of an emergency. We present Usability and UX findings from a pilot study conducted with 30 participants with varying degrees of blindness, along with feedback for improving both the available functionality of the application and the process by which blind users learn its features. The study employed standardized questionnaires and semi-structured interviews. The evaluation took place after the participants were exposed to the system's functionality through specialized user-centered training sessions organized around a training version of the application that involves route simulation. The results indicate an overall positive attitude from the users.
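As a hedged illustration of the emergency feature mentioned above (placing a call and announcing the current location), the following Kotlin sketch uses the standard Android dialer intent. The helper names, the preset contact parameter, and the idea of composing a spoken location message are assumptions for illustration, not code from BlindRouteVision.

```kotlin
// Hypothetical sketch of an emergency helper: opens the system dialer with a
// preset contact and builds a location message suitable for text-to-speech.
// Names and structure are assumptions, not the BlindRouteVision source.

import android.content.Context
import android.content.Intent
import android.net.Uri

object EmergencyHelper {
    /** Opens the system dialer pre-filled with the emergency contact number. */
    fun dialEmergencyContact(context: Context, phoneNumber: String) {
        val intent = Intent(Intent.ACTION_DIAL, Uri.parse("tel:$phoneNumber"))
            .addFlags(Intent.FLAG_ACTIVITY_NEW_TASK)
        context.startActivity(intent)
    }

    /** Builds the message that would be announced aloud in an emergency. */
    fun locationAnnouncement(latitude: Double, longitude: Double): String =
        "Emergency. Current location: latitude %.5f, longitude %.5f"
            .format(latitude, longitude)
}
```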
A reliable state-of-the-art obstacle detection algorithm is proposed for a mobile application that analyzes, in real time, the data received from an external sonar device and decides whether to audibly warn the blind person about near-field obstacles. The proposed algorithm can equip an orientation and navigation device that allows blind pedestrians to walk outdoors safely and autonomously. The smartphone application and the external microelectronic device together serve as a wearable system supporting the safe outdoor navigation and guidance of blind people. The external device collects information using an ultrasonic sensor and a GPS module. Its main objective is to detect obstacles in the user's path and to provide, through spoken instructions, information about each obstacle's distance, size, and potential motion, as well as advice on how it can be avoided. As a result, blind users can feel more confident, detecting obstacles by hearing before reaching them with the walking cane, including hazardous obstacles that cannot be sensed at ground level. Besides presenting the micro-servo-motor ultrasonic obstacle detection algorithm, the paper also presents the external microelectronic device integrating the sonar module, the impulse noise filtering implementation, the power budget of the sonar module, and the system evaluation. The presented work is an integral part of a state-of-the-art outdoor blind navigation smartphone application implemented in the MANTO project.
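To make the decision logic described above more concrete, here is a minimal Kotlin sketch of one way such a pipeline could look: a sliding-window median filter to reject impulse noise in the sonar readings, followed by a simple distance-threshold warning decision. The class name, window size, and threshold are illustrative assumptions and do not reflect the actual MANTO filtering or detection implementation.

```kotlin
// Hypothetical sketch: median-filter impulse-noise rejection over sonar
// readings, followed by a simple near-field warning decision.
// Window size and threshold are illustrative assumptions.

class SonarObstacleDetector(
    private val windowSize: Int = 5,           // assumed sliding-window length
    private val warnDistanceCm: Double = 150.0 // assumed near-field threshold
) {
    private val window = ArrayDeque<Double>()

    /** Feeds one raw ultrasonic distance reading (cm); returns a warning
     *  message when a filtered obstacle lies within the near-field range. */
    fun onReading(rawDistanceCm: Double): String? {
        window.addLast(rawDistanceCm)
        if (window.size > windowSize) window.removeFirst()
        if (window.size < windowSize) return null   // not enough samples yet

        // Median of the window suppresses single-sample impulse spikes.
        val filtered = window.sorted()[windowSize / 2]

        return if (filtered <= warnDistanceCm)
            "Obstacle ahead at about ${filtered.toInt()} centimeters"
        else null
    }
}

fun main() {
    val detector = SonarObstacleDetector()
    // Simulated stream with one impulse-noise spike (999.0) that is ignored.
    listOf(320.0, 310.0, 999.0, 140.0, 135.0, 130.0, 128.0)
        .forEach { d -> detector.onReading(d)?.let { msg -> println(msg) } }
}
```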
The effort to establish aids for individuals who are visually impaired has prompted many cities to seek solutions for improving their quality of life. For example, cities have installed sound-emitting devices at traffic lights and along sidewalks to assist navigation. Moreover, as cities continually strive to innovate in navigation for disabled individuals, smart traffic lights capable of synchronizing in real time with traffic and individual mobility conditions are already being installed around the world. This is in line with the adoption of the smart city concept, which involves a set of methodologies and indicators regulating how cities perform in promoting citizens' quality of life. Another important principle is the techno-economic aspect, which calls for careful, low-cost planning that produces cost-efficient solutions; additional important issues are maintenance, power efficiency, and the means to coordinate numerous devices so that they operate in a timely and reliable manner. In this article, we present an overview of the existing solutions for the navigation of people who are blind and visually impaired, along with a requirement analysis based on feedback from interviews with members of the Lighthouse for the Blind of Greece, both of which lead to the proposal of a new implementation that advances the state of the art.
Training blind and visually impaired individuals is an important but often neglected aspect of Assistive Technology (AT) solutions that can benefit from systems utilizing multiple sensors and hardware devices. Training serves a dual purpose: it not only enables the target group to use ATs effectively but also helps improve their low acceptance rate. In this paper, we present the design, implementation, and validation of a smartphone-based training application. It is a form of immersive system that enables users to learn the features of an outdoor blind pedestrian navigation application and, simultaneously, helps them develop long-term Orientation and Mobility (O&M) skills. The system consists of an Android application leveraging, as data sources, an external high-accuracy GPS sensor for real-time pedestrian mobility tracking, a second custom-made device attached to traffic lights for identifying their status, and an ultrasonic sensor for detecting near-field obstacles on the navigation path of the users. The training version, running as an Android application, employs route simulation with audio and haptic feedback, is functionally equivalent to the main application, and was used in the context of specially designed user-centered training sessions. A Usability and User Experience (UX) evaluation revealed the positive attitude of the users towards the training version as well as their satisfaction with the skills acquired during their training sessions (SUS = 69.1, UEQ+ = 1.53). This positive attitude was further confirmed by a Recursive Neural Network (RNN)-based sentiment analysis of the user responses, which yielded a score of 3 on a scale from 0 to 4. Finally, we conclude with the lessons learned and propose general design guidelines addressing the observed lack of accessibility and non-universal interfaces.
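For context on the reported SUS figure, the sketch below shows the standard System Usability Scale scoring rule (odd-numbered items contribute response − 1, even-numbered items contribute 5 − response, and the sum is multiplied by 2.5 to yield a 0-100 score). The participant responses in the example are hypothetical and are not data from the study.

```kotlin
// Minimal sketch of the standard SUS scoring rule behind a mean score such
// as the 69.1 reported above. The responses below are illustrative only.

/** Computes the SUS score (0-100) from one participant's ten 1-5 responses. */
fun susScore(responses: List<Int>): Double {
    require(responses.size == 10 && responses.all { it in 1..5 })
    val adjusted = responses.mapIndexed { i, r ->
        if (i % 2 == 0) r - 1   // odd-numbered items (1, 3, ...): response - 1
        else 5 - r              // even-numbered items (2, 4, ...): 5 - response
    }
    return adjusted.sum() * 2.5
}

fun main() {
    // Hypothetical responses from two participants.
    val participants = listOf(
        listOf(4, 2, 4, 2, 5, 1, 4, 2, 4, 2),
        listOf(3, 3, 4, 2, 4, 2, 3, 3, 4, 2)
    )
    val mean = participants.map(::susScore).average()
    println("Mean SUS score: $mean")   // 80.0 and 65.0 -> 72.5
}
```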