2017
DOI: 10.1080/17489725.2017.1323125
Maps, vibration or gaze? Comparison of novel navigation assistance in indoor and outdoor environments


Cited by 14 publications (8 citation statements) | References 34 publications
“…Novel interaction modalities and paradigms, and context-aware user interfaces, are available nowadays. In addition to traditional user interfaces through which people can interact with text-based information or cartographic maps, novel interaction modes, such as audio, gesture, gaze, or vibration (Gkonos et al 2017), and displays integrating augmented and virtual reality exist (Rudi et al 2016).…”
Section: Geosmartness
confidence: 99%
“…When a decision point with different options is approached, the user starts to examine the possible ones to follow. At the moment when the user's gaze is aligned with the correct street, the system automatically provides feedback to convey this, for example through a vibrotactile belt or, more effectively, its combination with gaze information (Gkonos et al 2017). Systems for real-time gaze tracking in outdoor environments, which map the gazes from a mobile eye tracker to a georeferenced view using computer vision methods, allow for such personalized gaze-based decision support (Anagnostopoulos et al 2017).…”
Section: Personalized Gaze-based Decision Support
confidence: 99%
“…maps and voices) to achieve better user experience in LBS. Research attention has also been drawn to compare the effectiveness of different interfaces in LBS (Huang, Schmidt, and Gartner 2012; Gkonos, Giannopoulos, and Raubal 2017). In terms of interface technologies and devices, smartphones are not the only mobile client in LBS.…”
Section: Towards Non-intrusive User Interfaces
confidence: 99%
“…On the other hand, the spatial arrangement of indoor spaces differs from that of outdoor spaces. For example, multiple kinds of objects can be regarded as landmarks in outdoor environments [25], such as churches, shopping malls, and bridges. However, these objects cannot be regarded as landmarks in indoor environments [26].…”
Section: Indoor Landmark Salience Models
confidence: 99%