2012
DOI: 10.1002/rob.21409

Field trial results of planetary rover visual motion estimation in Mars analogue terrain

Abstract: This paper presents the Mojave Desert field test results of planetary rover visual motion estimation (VME) developed under the “Autonomous, Intelligent, and Robust Guidance, Navigation, and Control for Planetary Rovers (AIR‐GNC)” project. Three VME schemes are compared in realistic conditions. The main innovations of this project include the use of different features from stereo‐pair images as visual landmarks and the use of vision‐based feedback to close the path‐tracking loop. The multiweek field campaign, c…

Cited by 9 publications (9 citation statements)
References 26 publications (23 reference statements)
“…Although inertial navigation systems are unaffected by being underwater, they are costly and subject to drift [5]. Optical vision systems have been effectively used for navigation on land [6], in air [7] and underwater [8]. Underwater, they have the advantage of providing an absolute position referenced to the seabed.…”
Section: Introduction
confidence: 99%
“…The 3D terrain data were segmented into 15 × 15 cm cells in the horizontal plane. The terrain assessment algorithm (Bakambu et al., ) was then applied to each cell of data to compute its traversability cost, based on the known ground clearance of the rover, the maximum slope that the rover is capable of traversing, and the roughness of the terrain. Based on these three criteria, the resulting cost map shows cells that are either unknown (due to a lack of data), traversable, untraversable, or part of a safety boundary based on the known footprint of the rover.…”
Section: Approach and System Overview
confidence: 99%
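The per-cell traversability test quoted above can be sketched as follows. This is a minimal illustration of the three stated criteria (ground clearance, maximum slope, roughness); the threshold values, labels, and function name are illustrative assumptions, not the values or code used in the cited work.

```python
import numpy as np

CELL = 0.15          # cell size in metres (15 x 15 cm, as in the quoted text)
UNKNOWN, FREE, BLOCKED = -1, 0, 1   # hypothetical cost labels

def cell_cost(z, clearance=0.10, max_slope_deg=20.0, max_rough=0.03):
    """Classify one cell of terrain heights z (metres).

    Criteria follow the quoted description: step hazard vs. rover ground
    clearance, local slope vs. the maximum traversable slope, and surface
    roughness. Thresholds here are illustrative assumptions only.
    """
    z = np.asarray(z, dtype=float)
    if z.size < 3 or np.isnan(z).any():
        return UNKNOWN                          # not enough data for this cell
    step = z.max() - z.min()                    # height span as step-hazard proxy
    slope = np.degrees(np.arctan2(step, CELL))  # coarse slope across the cell
    rough = z.std()                             # roughness as height std-dev
    if step > clearance or slope > max_slope_deg or rough > max_rough:
        return BLOCKED
    return FREE

# nearly flat cell -> traversable; cell containing a 20 cm step -> untraversable
print(cell_cost([0.00, 0.01, 0.005, 0.008]))   # 0 (FREE)
print(cell_cost([0.00, 0.20, 0.01, 0.02]))     # 1 (BLOCKED)
```

A safety boundary, as described in the quote, would then be obtained by dilating the blocked cells by the rover footprint.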
“…Visual odometry (VO), a term coined by Nister (Nister, Naroditsky, & Bergen, ), provides a means of estimating the position and orientation of single or multiple cameras from image input by detecting and tracking features over time, and it has become a widespread means of relative navigation for mobile robots (Bakambu et al., ; Konolige, Agrawal, & Sola, ; Lambert et al., ; Maimone, Chang, & Matthies, ; Sibley, Mei, Reid, & Newman, ; Souvannavong, Lemaréchal, Rastel, & Maurette, ; Wagner, Wettergreen, & Iles, ). This relative localization technique was complemented by a low‐frequency absolute localization process that exploits the low‐resolution digital elevation model (DEM) derived from orbiter data.…”
Section: Introduction
confidence: 99%
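Once feature tracking has produced frame-to-frame motion estimates, the "relative navigation" step quoted above reduces to chaining those increments into a global pose. A planar sketch of that bookkeeping (real VO operates in SE(3), and the function name here is a hypothetical illustration):

```python
import numpy as np

def chain_poses(relative_motions):
    """Integrate per-frame relative motions (dx, dy, dtheta), expressed in
    the robot body frame, into global 2D poses (x, y, theta).

    This is the dead-reckoning core of visual odometry: each new pose is
    the previous pose composed with the latest relative estimate, so any
    per-frame error accumulates (drift), motivating the low-frequency
    absolute localization against orbiter DEMs mentioned in the quote.
    """
    x, y, th = 0.0, 0.0, 0.0
    poses = [(x, y, th)]
    for dx, dy, dth in relative_motions:
        # rotate the body-frame step into the world frame, then accumulate
        x += dx * np.cos(th) - dy * np.sin(th)
        y += dx * np.sin(th) + dy * np.cos(th)
        th += dth
        poses.append((x, y, th))
    return poses

# drive 1 m forward while turning 90 degrees, then 1 m forward again:
path = chain_poses([(1.0, 0.0, np.pi / 2), (1.0, 0.0, 0.0)])
print(path[-1])   # final pose near (1, 1, pi/2)
```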
“…Several studies on rover global localization used a matching of rover locally mapped terrain (using Lidar or stereovision data) and the orbiter‐generated Digital Elevation Model. The use of real‐time Simultaneous Localization and Mapping (SLAM) methods has also been proposed for future rovers. In particular, Das et al.…”
Section: Introduction
confidence: 99%
“…[33][34][35][36][37] The use of real-time Simultaneous Localization and Mapping (SLAM) methods has also been proposed for future rovers. [38][39][40] In particular, Das et al. [41] from the University of Waterloo competed in the NASA SRR challenge using a robot equipped with a 3D Lidar‐based real-time pose graph SLAM. Meanwhile, significant progress has also been made in assessing terrain traversability for planetary rovers.…”
Section: Introduction
confidence: 99%