2020
DOI: 10.3390/app10041195

Regressed Terrain Traversability Cost for Autonomous Navigation Based on Image Textures

Abstract: The exploration of remote, unknown, rough environments by autonomous robots strongly depends on the ability of the on-board system to build an accurate predictor of terrain traversability. Terrain traversability prediction can be made more cost-efficient by using texture information from 2D images obtained by a monocular camera. In cases where the robot is required to operate on a variety of terrains, it is important to consider that terrains sometimes contain spiky objects that appear as non-uniform in the text…
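To make the texture-based idea in the abstract concrete, the following is a minimal sketch of how terrain texture could be summarised from a monocular image patch using grey-level co-occurrence (GLCM) statistics. The GLCM descriptor, its parameters, and the function name are assumptions for illustration; the paper's actual feature set may differ.

```python
# Hedged sketch (not the paper's implementation): GLCM texture statistics
# computed on a uint8 grayscale patch, as one plausible terrain descriptor.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def texture_features(patch_gray_u8):
    """Return a small texture descriptor for one uint8 grayscale image patch."""
    glcm = graycomatrix(patch_gray_u8,
                        distances=[1, 2],
                        angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                        levels=256, symmetric=True, normed=True)
    props = ("contrast", "homogeneity", "energy", "correlation")
    # Average each property over all distances and angles -> 4-D feature vector.
    return np.array([graycoprops(glcm, p).mean() for p in props])
```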

Cited by 17 publications (9 citation statements)
References 32 publications
“…In the past, several works have been proposed for the in-situ classification of terrain types, based exclusively on proprioceptive data while actually experiencing the traverse on the (potentially dangerous) terrain [88, 89]. Among the works reported in the present survey, a more recent and more frequently adopted approach consists of using proprioceptive data during training as well as for data labeling [34, 35, 66]. This approach has already been shown to provide better classification and regression results than methods based on exteroceptive data only, although only the latter are used at deployment time.…”
Section: Discussion
confidence: 99%
“…In a similar fashion, Bekhti and Kobayashi [35] train a Gaussian process regressor (GPR) to predict vehicle vibration (as a measure of terrain traversability) while moving over the terrain, combined with terrain texture features detected by processing images from an on-board RGB camera. Also in this case, the regressor is trained with proprioceptive data (i.e., accelerometer data) acquired during the traversal, whereas on-line traversal cost regression is based on the incoming RGB image only.…”
Section: Terrain Traversability Analysis
confidence: 99%
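A rough sketch of the training/deployment split described in this statement follows: a Gaussian process regressor is fitted on pairs of image texture features and a vibration label derived from accelerometer data, and at run time the traversal cost is predicted from image features alone. The label definition (standard deviation of vertical acceleration), the kernel choice, and the placeholder data are assumptions, not the authors' exact pipeline.

```python
# Hedged sketch: GPR mapping texture features to a vibration-based cost.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

def vibration_label(accel_z_window):
    # Assumed supervisory label: std of vertical acceleration over a window.
    return float(np.std(accel_z_window))

X = np.random.rand(50, 4)   # placeholder texture features gathered during traversal
y = np.random.rand(50)      # placeholder vibration labels from the accelerometer
gpr = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gpr.fit(X, y)

# Deployment: predict traversal cost (with uncertainty) from image features only.
cost_mean, cost_std = gpr.predict(np.random.rand(1, 4), return_std=True)
```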
“…Terrain traversability analysis can be referred to as the problem of estimating the difficulty of driving through a terrain for a ground vehicle [18]. Bekhti et al. use terrain images and acceleration signals to train a Gaussian process regressor in order to predict vibrations using only image texture features [19]. Maturana et al. propose a real-time mapping strategy that provides a 2.5D grid map centered on the vehicle frame, encoding both the geometry and the semantic information of the environment [20].…”
Section: Related Work
confidence: 99%
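The 2.5D grid map mentioned for Maturana et al. [20] can be pictured as two aligned per-cell layers, one geometric and one semantic. The sketch below is an illustrative data structure under assumed map size, resolution, and update rule; it is not the cited implementation.

```python
# Hedged sketch: vehicle-centred 2.5D grid with elevation and semantic layers.
import numpy as np

class Grid25D:
    def __init__(self, size_m=20.0, resolution_m=0.2):
        n = int(size_m / resolution_m)
        self.res = resolution_m
        self.half = size_m / 2.0
        self.elevation = np.full((n, n), np.nan)          # geometry layer
        self.semantics = np.full((n, n), -1, dtype=int)   # semantic label layer

    def update(self, x, y, z, label):
        """Insert one point (x, y in the vehicle frame, metres) into the map."""
        i = int((x + self.half) / self.res)
        j = int((y + self.half) / self.res)
        if 0 <= i < self.elevation.shape[0] and 0 <= j < self.elevation.shape[1]:
            # One simple choice: keep the highest observed elevation per cell.
            if np.isnan(self.elevation[i, j]) or z > self.elevation[i, j]:
                self.elevation[i, j] = z
            self.semantics[i, j] = label
```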
“…Proprioceptive sensors that measure the internal state of the robot, e.g., acceleration [42, 43], force [44, 45], torque [45], and vibration [44], are commonly used to generate supervisory signals for traversability, as they directly reflect the physical robot-terrain interactions. Stavens and Thrun [42] generate labels of terrain roughness automatically from a vehicle's inertial measurements while driving.…”
Section: B. Self-Supervised Learning for Traversability Estimation
confidence: 99%
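As an illustration of the self-supervised labelling idea attributed to Stavens and Thrun [42], the sketch below derives a roughness score per time window from vertical acceleration, normalised by speed; the windowing scheme and the variance-based score are assumptions rather than the cited method's exact definition.

```python
# Hedged sketch: deriving supervisory roughness labels from inertial data.
import numpy as np

def roughness_labels(accel_z, speeds, window=50):
    """Score each window of vertical acceleration by its variance, divided by
    the mean speed so that fast driving on smooth ground is not over-labelled."""
    labels = []
    for start in range(0, len(accel_z) - window + 1, window):
        seg = np.asarray(accel_z[start:start + window])
        v = max(float(np.mean(speeds[start:start + window])), 1e-3)
        labels.append(float(np.var(seg)) / v)  # larger value -> rougher terrain
    return labels
```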