We present a wearable assistance system for visually impaired persons that perceives the environment with a stereo camera system and communicates obstacles and other objects to the user in the form of intuitive acoustic feedback. The system is designed to complement traditional assistance aids. We describe the core techniques of scene understanding, head tracking, and sonification, and show in an experimental study that the system enables users to walk in unknown urban terrain and to avoid obstacles safely.
This work discusses the performance of a Stochastic Cloning Unscented Kalman Filter (SC-UKF) used to fuse incremental position and orientation information from Visual Odometry (VO), obtained with a stereo camera setup, with the absolute attitude provided by a low-cost inertial measurement unit (IMU). The system is designed for pedestrian tracking in uncontrolled environments and employs a quaternion-based attitude representation in the filter state. The attitude is cloned and kept between lower-rate VO samples, while inertial data is processed in real time at a higher sampling rate. The relative orientation from the VO, corresponding to the same time span, is used to correct the IMU-based rotation difference between the cloned and the current attitude. Magnetic compass information is included to improve the heading estimate, together with a mechanism for compensating magnetic disturbances. The filter scheme is extended by implementing the INS mechanization equations for position estimation, where the VO data serves as a velocity observation to reduce the growth rate of the position error. The performance of the designed SC-UKF is compared to that of an SC-based Extended Kalman Filter on a number of representative walking paths. The augmented system shows better performance, especially on indoor segments such as corridors with insufficient illumination and staircases with monotone walls.
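The stochastic-cloning correction described above can be illustrated with a minimal sketch: the IMU-propagated rotation over a VO interval is the quaternion difference between the cloned and the current attitude, and the VO's relative orientation over the same span yields a small-angle measurement residual. This assumes Hamilton-convention quaternions in `[w, x, y, z]` order; the function names and residual form are illustrative, not the authors' implementation.

```python
import numpy as np

def quat_mul(q, r):
    """Hamilton product of two quaternions [w, x, y, z]."""
    w0, x0, y0, z0 = q
    w1, x1, y1, z1 = r
    return np.array([
        w0*w1 - x0*x1 - y0*y1 - z0*z1,
        w0*x1 + x0*w1 + y0*z1 - z0*y1,
        w0*y1 - x0*z1 + y0*w1 + z0*x1,
        w0*z1 + x0*y1 - y0*x1 + z0*w1,
    ])

def quat_conj(q):
    """Conjugate (inverse for unit quaternions)."""
    return q * np.array([1.0, -1.0, -1.0, -1.0])

def relative_rotation(q_cloned, q_current):
    """IMU-based rotation accumulated since the clone:
    q_rel = q_cloned^{-1} (x) q_current."""
    return quat_mul(quat_conj(q_cloned), q_current)

def residual(q_imu_rel, q_vo_rel):
    """Small-angle innovation between the IMU-propagated and the
    VO-measured relative rotation over the same time span."""
    dq = quat_mul(quat_conj(q_vo_rel), q_imu_rel)
    return 2.0 * dq[1:] * np.sign(dq[0])
```

In the filter, this residual would drive the UKF update of the attitude error, after which the clone is reset to the current attitude for the next VO interval.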
We present a wearable assistance system for visually impaired persons that perceives the environment with a stereo camera and communicates obstacles and other objects to the user. We develop our idea of combining perception at an increased level of scene understanding with acoustic feedback to obtain an intuitive mobility aid. We describe our core techniques of scene modelling, object tracking, and acoustic feedback, and show in an experimental study how our system can help improve the mobility and safety of visually impaired users.