2021
DOI: 10.48550/arxiv.2109.14026
Preprint

Learning Perceptual Locomotion on Uneven Terrains using Sparse Visual Observations

Abstract: Legged robots have achieved remarkable performance in blind walking using either model-based control or data-driven deep reinforcement learning. To proactively navigate and traverse various terrains, active use of visual perception becomes indispensable, and this work aims to exploit the use of sparse visual observations to achieve perceptual locomotion over a range of commonly seen bumps, ramps, and stairs in human-centred environments. We first formulate the selection of minimal visual input that can represe…

Cited by 0 publications
References 16 publications