Proceedings of the 2017 CHI Conference Extended Abstracts on Human Factors in Computing Systems 2017
DOI: 10.1145/3027063.3053201
Inferring Landmarks for Pedestrian Navigation from Mobile Eye-Tracking Data and Google Street View

Cited by 17 publications (10 citation statements); references 24 publications.
“…There exists previous work that has used street level photographs as an information source for geographical information analysis, with either automated or crowdsourced data, to identify accessibility problems [13], landmarks for pedestrian navigation [20], curb ramps [14], and scenic driving routes [39], but we know of no previous work on recognizing parkourability or other physical activity opportunities.…”
Section: Analyzing Urban Imagery (citation type: mentioning; confidence: 99%)
“…Empirical evidence has been provided that the use of landmarks has a positive impact on wayfinding performance (see, e.g., Ross, May & Thompson, 2004; Tom & Denis, 2004) and that the absence of landmarks in an environment is compensated by an increased granularity in verbal human-to-human route instructions (see Hirtle, Richter, Srinivas & Firth, 2010). Research on incorporating landmarks (see Richter & Winter, 2014, for a thorough overview of the concept) in route instructions for wayfinding assistance systems has, consequently, become a predominant research topic, including modeling (see, e.g., Caduff & Timpf, 2008; Nothegger, Winter & Raubal, 2004; Nuhn & Timpf, 2017; Raubal & Winter, 2002; Winter, 2003), empirical assessment of salience (see, e.g., Götze & Boye, 2016; Kattenbeck, 2017; Kattenbeck, Nuhn & Timpf, 2018; Quesnot & Roche, 2015), and the automatic selection of landmarks (see, e.g., Duckham, Winter & Robinson, 2010; Lander, Herbig, Löchtefeld, Wiehr & Krüger, 2017; Lazem & Sheta, 2005; Rousell & Zipf, 2017; Wang & Ishikawa, 2018).…”
Section: Research on Route Instructions (citation type: mentioning; confidence: 99%)
“…Urban Pulse [58] uses computational topology techniques and data from Twitter and Flickr to visualize spatio-temporal activity across various resolutions, and Urban Space Explorer [42] proposes an approach to explore public-space-related activity. Another direction is using computer vision algorithms to assess the built environment through street-view images, for example to: assess and map greenery and openness in urban settings [51, 49], quantify the daily exposure of urban residents to eye-level street greenery [90], extract land use information [50], measure visual quality [82], visual enclosure [91], urban form [80] and sky exposure [19] of street spaces, and assess traffic signs [11], curb ramps [37] and urban landmarks [46]. Street-level images have also been used to predict relationships between a city's built environment and socioeconomic conditions [5], and as the basis for automatically extracting a city's most distinctive visual elements [25].…”
Section: Related Work (citation type: mentioning; confidence: 99%)