2020
DOI: 10.1177/0145482x20941312
Environmental Information Required by Individuals with Visual Impairments Who Use Orientation and Mobility Aids to Navigate Campuses

Abstract: Introduction: This study investigated the user requirements of individuals with visual impairments regarding the information to be included in orientation and mobility (O&M) aids, so that optimally useful audio-tactile maps of campuses can be developed. The study also investigated the importance (usefulness) that individuals with visual impairments attribute to environmental information about campuses. Methods: The researchers listed 213 pieces of environmental information concerning cam…

Cited by 9 publications (5 citation statements) | References 28 publications
“…Navigation tools that collect data on the behavioral and cognitive underpinnings of the user can reduce gender differences in the process of wayfinding ( Martens and Antonenko, 2012 ). In addition, navigation tools for people with visual impairments ( Nam et al., 2015 ; Oliveira et al., 2018 ; Papadopoulos et al., 2020 ; Rey-Galindo et al., 2020 ), people with autism ( Irish, 2019 ; Yang et al., 2021 ), and people with dementia ( Blackman et al., 2007 ; Cm et al., 2020 ; Gresham et al., 2019 ; Marquardt and Schmieg, 2009 ) are increasingly being studied for wayfinding solutions and techniques.…”
Section: Discussion
confidence: 99%
“…Researchers have looked at numerous technologies, such as artificial intelligence, to identify navigational aids that work for people who are visually impaired. Recently, there has been growing exploration of deep learning models for obstacle detection in navigation aid systems [2][3][4][5]. Although many object detection models are available, choosing one suited to a real-time navigational environment, with a minimal memory footprint and short inference time, requires rigorous research and analysis.…”
Section: Introduction
confidence: 99%
“…Determining which elements of an interior environment should be included or excluded, and then finding good representations for the tactile map components, is challenging [30,31]. Some previous attempts at creating 3D-printed maps focused on using information provided for sighted people, such as 2D map illustrations or photographic imagery (e.g., Google Maps), and applying braille text [30].…”
Section: Introduction
confidence: 99%
“…Although some recent research points to the use of 3D-printed iconic symbols [40], the results from these and other studies demonstrate that the function (i.e., the readability and representation) of a symbol must drive the design and inclusion parameters. Additionally, overviews of previous tactile maps in [31,40] mention that users were specifically interested in safety and navigation information, entrances/exits, and indications of hazardous areas. In this study, we present our iterative development process and the resulting set of new tactile encodings for improved navigation of interior spaces, designed using continuous user feedback.…”
Section: Introduction
confidence: 99%