2014
DOI: 10.1007/978-3-319-08599-9_16
ARGUS Autonomous Navigation System for People with Visual Impairments

Cited by 5 publications (2 citation statements) · References 2 publications
“…Thirdly, in addition to SSD, several systems used virtual sound sources to intuitively guide users along a route, e.g., [34,35]. As appreciated in PVAS implementation and the corresponding user tests, these acoustic cues are supported by VES; in fact, these were the only cues available to reach the three columns in the first VES navigation experiment (Figure 10 and Figure 11b).…”
Section: Discussion
Confidence: 99%
“…Other interfaces incorporated non-linguistic solutions, ranging from spatialized audio to haptic displays. For instance, Head-Related Transfer Function (HRTF) sound processing and motion capture systems were applied to guide BVI users along a route by reproducing virtual sound sources at intermediate points [9]. Also, hand-held devices were developed which provided verbal messages according to the position pointed at, or tapped, on a map [10,11]; others change shape to provide heading cues [12].…”
Section: Related Work
Confidence: 99%
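The Related Work statement above describes guiding users by reproducing a virtual sound source at the next waypoint. The core of such guidance is mapping the waypoint's azimuth relative to the user's heading onto a spatial audio cue. The sketch below illustrates that idea with a simple constant-power interaural level difference (ILD) pan; the function name, coordinate convention, and the pan law are illustrative assumptions, not the cited systems' actual processing, which would convolve the cue with measured HRTFs.

```python
import math

def waypoint_panning(listener_xy, heading_rad, waypoint_xy):
    """Return (left_gain, right_gain) for a virtual sound source at the
    next waypoint. Hypothetical ILD-only stand-in for true HRTF
    spatialization: it conveys left/right direction but not elevation
    or front/back cues.
    """
    dx = waypoint_xy[0] - listener_xy[0]
    dy = waypoint_xy[1] - listener_xy[1]
    # Azimuth of the waypoint relative to the listener's heading,
    # wrapped to (-pi, pi]; positive means the waypoint is to the right.
    azimuth = math.atan2(dx, dy) - heading_rad
    azimuth = math.atan2(math.sin(azimuth), math.cos(azimuth))
    # Constant-power pan law: map azimuth to a pan position in [0, 1],
    # where 0 is hard left, 0.5 is center, and 1 is hard right.
    pan = (math.sin(azimuth) + 1.0) / 2.0
    left = math.cos(pan * math.pi / 2.0)
    right = math.sin(pan * math.pi / 2.0)
    return left, right
```

A waypoint straight ahead yields equal gains in both ears; one directly to the right drives the right channel only. Per-audio-frame, the gains would scale the cue signal before playback.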