Introduction
Technological advances have introduced three-dimensional (3-D) printing as an option for creating tactile maps for people with visual impairments (that is, those who are blind or have low vision), diversifying the types of map products that are available. At the same time, 3-D printing challenges map makers to implement designs across multiple production methods. We evaluated map symbols to determine their discriminability across three materials: microcapsule paper, 3-D printer plastic, and embossed paper.

Methods
In a single session lasting less than 90 minutes, participants completed a matching task and provided informal feedback regarding their preferences. We measured speed and accuracy to establish the discriminability of the map symbols on each material. Eighteen participants were recruited from a referred sample of attendees at the 2013 American Council of the Blind annual convention.

Results
Response times differed significantly across the three materials (p < 0.001). Without sacrificing accuracy, response times were faster for the 3-D printed graphics than for either the microcapsule paper (p < 0.001) or the embossed paper (p < 0.001). User preference was divided across the three materials: some participants disliked the “sharp” corners of the 3-D printed symbols, while others preferred their “crisp” edges.

Discussion
Our results demonstrate faster discrimination of a set of tactile symbols produced on a 3-D printer compared with the same symbols printed on microcapsule paper, the material for which the symbols were originally designed. Participant feedback included preferences both for and against reading symbols produced on the 3-D printer.

Implications for practitioners
This article discusses the functional equivalence of tactile symbols produced across multiple production technologies. It addresses two considerations when using 3-D printing to make tactile maps: preparing digital files for printing and the printing workflow. Digital files ready for printing on each of the three materials are available for download (Brittell, Lobben, & Lawrence, 2016).
Sonification of geospatial data must situate data values in two- (or three-) dimensional space. The need to position data values in space distinguishes geospatial data from other multi-dimensional data sets. While cartographers have extensive experience preparing geospatial data for visual display, the use of sonification is less common. Beyond the availability of tools or visual bias, an incomplete understanding of the implications of parameter mappings that cross conceptual data categories limits the application of sonification to geospatial data. To catalyze the use of audio in cartography, this paper explores existing examples of parameter mapping sonification through the framework of the geographic data cube. More widespread adoption of auditory displays would diversify map design techniques, enhance the accessibility of geospatial data, and may also provide a new perspective for application to non-geospatial data sets.
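The core idea of parameter mapping sonification, mapping a data dimension onto a sound parameter, can be sketched minimally. The scaling choices below (a linear mapping into a 220–880 Hz pitch range, and the names value_to_pitch and sonify_series) are illustrative assumptions, not a mapping taken from the paper.

```python
# A minimal sketch of parameter mapping sonification: each data value is
# rescaled linearly into an audible frequency range, so larger values
# sound higher. A geospatial variant would map position onto a second
# parameter (e.g., stereo pan or playback time) to situate values in space.

def value_to_pitch(value, vmin, vmax, fmin=220.0, fmax=880.0):
    """Linearly rescale a data value into a frequency range (Hz)."""
    if vmax == vmin:
        return fmin  # degenerate range: all values map to the base pitch
    t = (value - vmin) / (vmax - vmin)
    return fmin + t * (fmax - fmin)

def sonify_series(values):
    """Map a series of data values to a series of pitches."""
    vmin, vmax = min(values), max(values)
    return [value_to_pitch(v, vmin, vmax) for v in values]

pitches = sonify_series([2.0, 5.0, 8.0])
# the smallest value maps to 220 Hz, the largest to 880 Hz
```

The linear value-to-frequency mapping is the simplest choice; a perceptually motivated design might instead scale pitch logarithmically, since pitch perception is roughly logarithmic in frequency.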
Spatial data are increasingly available, but the ubiquitous use of graphical displays to communicate such data renders them inaccessible to people who are blind or have low vision. Not only does this affect the level of access to data, it also limits educational opportunities due to a lack of accessible maps and geographic information systems. This lack may be due in part to the challenge of creating a system that provides a usable display without relying on vision. Simply replacing the symbology of a map intended for a two-dimensional graphical display with parameters for another modality, such as audio with one primary axis (time), is insufficient. To address the need for accessible learning materials, we present a minimal geographic information system (mGIS) that uses an auditory display in combination with a tablet and stylus input device. Non-speech audio communicates attribute data, text-to-speech software renders feedback from the application menus, and kinesthetic feedback from actively controlling the stylus conveys location within the display. This paper presents details of the software implementation, discusses the development of an auditory symbology for choropleth maps (maps that display patterns of data over geographic space), and describes an initial evaluation of usability.
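An auditory symbology for a choropleth map can be illustrated with a small sketch. The abstract does not specify the actual mapping used in the mGIS; as an assumption for illustration only, the scheme below assigns one pitch per data class, analogous to assigning one shade of a color ramp per class in a visual choropleth.

```python
# Hypothetical auditory choropleth symbology: classify a value into one
# of several classes using ascending class breaks, then return the pitch
# assigned to that class. The class pitches (C4, E4, G4, B4) and the
# function names are illustrative choices, not the system's actual design.
import bisect

def classify(value, breaks):
    """Return the class index (0..len(breaks)) for a value,
    given ascending class-break values."""
    return bisect.bisect_right(breaks, value)

CLASS_PITCHES = [262.0, 330.0, 392.0, 494.0]  # one pitch (Hz) per class

def auditory_symbol(value, breaks):
    """Map a data value to the pitch of its class."""
    return CLASS_PITCHES[classify(value, breaks)]

breaks = [10.0, 20.0, 30.0]  # three breaks define four classes
```

In use, the system would synthesize the returned pitch whenever the stylus enters an enumeration unit, so sweeping the stylus across the map plays the spatial pattern of the data.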
Encoding cursor position and directional information in synthesized audio feedback facilitates line following. This technique will aid interpretation and spatial understanding of irregularly shaped line features (e.g., rivers and state boundaries), making maps more accessible to users who are blind or visually impaired.