Abstract: Wearable computers can certainly support audio-only presentation of information; a visual interface need not be present for effective user interaction. A System for Wearable Audio Navigation (SWAN) is being developed to serve as a navigation and orientation aid for persons with temporary or permanent visual impairment. SWAN is a wearable computer with audio-only output and tactile input via a task-specific handheld interface device. SWAN aids the user in safe pedestrian navigation and allows the user to author new GIS data relevant to wayfinding, obstacle avoidance, and situational awareness. Emphasis is placed on representing pertinent data with nonspeech sounds through a process of sonification. SWAN relies on a Geographic Information System (GIS) infrastructure to support geocoding and spatialization of data, and it also employs a novel tracking system.
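To illustrate what spatializing geocoded data can involve, the minimal Python sketch below computes the bearing from the user's position and heading to a waypoint, which a spatial audio engine could then use as the beacon's azimuth. The function name, the great-circle formula, and the final rendering call are illustrative assumptions, not SWAN's actual implementation.

```python
import math

def relative_bearing_deg(user_lat, user_lon, user_heading_deg, wp_lat, wp_lon):
    """Bearing from the user to a geocoded waypoint, relative to the user's heading.

    Standard initial great-circle bearing; 0 degrees means the waypoint is
    straight ahead, positive values are to the user's right.
    """
    lat1, lat2 = math.radians(user_lat), math.radians(wp_lat)
    dlon = math.radians(wp_lon - user_lon)
    x = math.sin(dlon) * math.cos(lat2)
    y = math.cos(lat1) * math.sin(lat2) - math.sin(lat1) * math.cos(lat2) * math.cos(dlon)
    absolute_bearing = math.degrees(math.atan2(x, y)) % 360.0
    # Normalize to [-180, 180) so the value can drive a left/right spatial cue.
    return (absolute_bearing - user_heading_deg + 180.0) % 360.0 - 180.0

# A spatial audio engine would then place the nonspeech beacon at this azimuth,
# e.g. (hypothetically) engine.set_source_azimuth(beacon_id, azimuth_deg).
```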
Objective: We examined whether spatialized nonspeech beacons could guide navigation and how sound timbre, waypoint capture radius, and practice affect performance. Background: Auditory displays may assist mobility and wayfinding for those with temporary or permanent visual impairment, but they remain understudied. Previous systems have used speech-based interfaces. Method: Participants (108 undergraduates) navigated three maps, guided by one of three beacons (pink noise, sonar ping, or 1000-Hz pure tone) spatialized by a virtual reality engine. Dependent measures were time and path-length efficiency. Results: Overall navigation was very successful, with significant effects of practice and capture radius, and interactions with beacon sound. Overshooting and subsequent hunting for waypoints were exacerbated in the small-radius conditions. A human-scale capture radius (1.5 m) and a sonar-like beacon yielded the optimal combination for safety and efficiency. Conclusion: The selection of beacon sound and capture radius depends on the specific application, including whether speed of travel or adherence to path is of primary concern. Extended use affects sound preferences and quickly leads to improvements in both speed and accuracy. Application: These findings should lead to improved wayfinding systems for the visually impaired as well as for first responders (e.g., firefighters) and soldiers.
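To make the capture-radius behavior concrete, here is a minimal, hypothetical Python sketch of waypoint-advance logic: the beacon keeps pointing at the current waypoint until the user moves within the capture radius, at which point guidance switches to the next waypoint. The names, the local-map coordinates, and the 1.5 m default are illustrative assumptions; only the capture-radius concept itself comes from the study.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Waypoint:
    x: float  # metres, local map frame
    y: float

def update_guidance(user_pos: Tuple[float, float],
                    waypoints: List[Waypoint],
                    current_idx: int,
                    capture_radius_m: float = 1.5) -> int:
    """Return the index of the waypoint the beacon should render.

    A small radius forces tight path adherence but invites the
    overshoot-and-hunt behaviour described above; a human-scale
    radius (about 1.5 m) was the best compromise in this study.
    """
    wx, wy = waypoints[current_idx].x, waypoints[current_idx].y
    ux, uy = user_pos
    distance = ((wx - ux) ** 2 + (wy - uy) ** 2) ** 0.5
    if distance <= capture_radius_m and current_idx + 1 < len(waypoints):
        return current_idx + 1  # waypoint captured: advance the beacon
    return current_idx
```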
There is a critical need for navigation and orientation aids for the visually impaired. Developing such displays is difficult and time-consuming due to the lack of design tools and guidelines, the inefficiency of trial-and-error design, and concerns for experimental participant safety. We discuss using a virtual environment (VE) to help in the design, evaluation, and iterative refinement of an auditory navigation system, and we address questions about the (real) interface that the VE version allows us to study, including sound design, system behavior, and user interface design. Improved designs should result from a more systematic and scientific method of assistive technology development. We also point out some ongoing caveats that researchers in this field need to consider, especially relating to external validity and over-reliance on VEs for design solutions.