2017
DOI: 10.48550/arxiv.1711.08040
Preprint

Identifying Most Walkable Direction for Navigation in an Outdoor Environment

Abstract: We present an approach for identifying the most walkable direction for navigation using a hand-held camera. Our approach extracts semantically rich contextual information from the scene using a custom encoder-decoder architecture for semantic segmentation and models the spatial and temporal behavior of objects in the scene using a spatio-temporal graph. The system learns to minimize a cost function over the spatial and temporal object attributes to identify the most walkable direction. We construct a new annot…

Cited by 1 publication (1 citation statement)
References 53 publications (89 reference statements)
“…For instance, when mapping in the pedestrian environment, it is important to identify which connected regions are correctly mapped to real pathways. In sum, uses of semantic segmentation are not pixel- but region-based, as evidenced in their application to aggregate semantically rich information about the surroundings used to assess navigation options for people with visual disabilities (e.g., Mehta et al [11] and Yang et al [12]).…”
Section: Introduction
Citation type: mentioning (confidence: 99%)