2014 IEEE International Conference on Robotics and Biomimetics (ROBIO 2014)
DOI: 10.1109/robio.2014.7090403

A multi-sensor-based mobile robot localization framework

Abstract: This paper presents a novel multi-sensor-based robot localization framework inspired by the human coarse-to-fine recognition mechanism to realize fast and robust localization in the process of robot navigation. This localization framework consists of two parts: coarse place recognition and accurate location estimation. The coarse place recognition is realized using an onboard camera, where an image retrieval system is employed. The coarse localization system utilizes feature matching between the observed image a…
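As a rough illustration of the coarse-to-fine idea described in the abstract, the sketch below pairs an image-retrieval step (coarse place recognition) with a feature-matching step (finer localization among the retrieved candidates). It is a minimal sketch, not the authors' implementation: the histogram-based retrieval, the ORB descriptors, and all function names and parameters are assumptions introduced for illustration.

```python
# Minimal coarse-to-fine localization sketch (hypothetical, not the paper's code).
# Coarse stage: rank database images by global colour-histogram similarity.
# Fine stage: re-rank the top candidates with ORB feature matching.
import cv2
import numpy as np

def global_descriptor(img):
    """Coarse descriptor: normalised HSV colour histogram."""
    hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([hsv], [0, 1], None, [32, 32], [0, 180, 0, 256])
    return cv2.normalize(hist, hist).flatten()

def coarse_place_recognition(query, database, top_k=5):
    """Return indices of the top_k database images most similar to the query."""
    q = global_descriptor(query).astype(np.float32)
    scores = [cv2.compareHist(q,
                              global_descriptor(img).astype(np.float32),
                              cv2.HISTCMP_CORREL)
              for img in database]
    return np.argsort(scores)[::-1][:top_k]

def fine_localization(query, database, candidates):
    """Pick the candidate place with the most ORB feature matches to the query."""
    orb = cv2.ORB_create(nfeatures=500)
    bf = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    _, q_des = orb.detectAndCompute(query, None)
    best_idx, best_matches = None, -1
    for idx in candidates:
        _, des = orb.detectAndCompute(database[idx], None)
        if q_des is None or des is None:
            continue
        n_matches = len(bf.match(q_des, des))
        if n_matches > best_matches:
            best_idx, best_matches = idx, n_matches
    return best_idx
```

In this two-stage arrangement the cheap global descriptor prunes the search space, and the more expensive local feature matching is run only on the surviving candidates, which is the efficiency argument behind coarse-to-fine recognition.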


Cited by 8 publications (6 citation statements). References 21 publications.
“…However, the use of such sensors can be very costly. Another solution is to use different sensors of lower accuracy and then apply a sensor fusion technique to the readings of all sensors in order to get a better estimate of the vehicle’s pose (position and orientation) using multiple low cost sensors [1,2,3,4,5,6].…”
Section: Introduction
confidence: 99%
“…As the field of autonomous driving continues to evolve, mobile robots with high mobility, such as ground mobile robotic vehicles and lunar rovers, play a pivotal role in mission execution [1,2]. These mobile robots are equipped with a variety of sensors, including navigation cameras (Navcams), hazard avoidance cameras (Hazcams), light detection and ranging (LiDAR), time-of-flight depth cameras (TOF), inertial measurement units (IMU), and others, which enable them to gather diverse information about the surroundings.…”
Section: Introduction
confidence: 99%
“…However, such sensors are usually costly and their accuracy degrades in remote areas or in GPS denied environments. An alternative solution is to use different low cost sensors of lower accuracy, and to apply a sensor fusion technique (Khaleghi et al, 2013) in order to get a better pose estimate of the autonomous system (Al-Kaff et al, 2018;Duan et al, 2014;Kelly and Sukhatme, 2011;Lu et al, 2017;Luo and Chang, 2012;Magrin and Todt, 2016;Osman et al, 2019a;Urmson et al, 2008).…”
Section: Introduction
confidence: 99%
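The citing works quoted above motivate fusing several lower-cost sensors to obtain a better pose estimate than any single sensor provides. As a hedged illustration of that idea, here is a minimal one-dimensional Kalman-filter sketch that fuses a noisy odometry-style prediction with a noisy absolute position fix; the class name, noise values, and the 1-D simplification are assumptions for illustration and are not taken from the cited papers.

```python
# Minimal 1-D Kalman filter fusing odometry-style predictions with noisy
# absolute position fixes (hypothetical illustration of multi-sensor fusion).
import numpy as np

class KalmanFusion1D:
    def __init__(self, x0=0.0, p0=1.0, q=0.01, r=0.25):
        self.x = x0   # position estimate
        self.p = p0   # estimate variance
        self.q = q    # process (odometry) noise variance
        self.r = r    # measurement (e.g. GPS-like fix) noise variance

    def predict(self, delta):
        """Propagate the estimate with a relative odometry increment."""
        self.x += delta
        self.p += self.q
        return self.x

    def update(self, z):
        """Correct the estimate with an absolute position measurement."""
        k = self.p / (self.p + self.r)   # Kalman gain
        self.x += k * (z - self.x)
        self.p *= (1.0 - k)
        return self.x

# Example: drifting odometry steps corrected by noisy absolute fixes.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    kf, true_x, est = KalmanFusion1D(), 0.0, 0.0
    for _ in range(50):
        true_x += 0.1
        kf.predict(0.1 + rng.normal(0, 0.05))          # odometry with drift
        est = kf.update(true_x + rng.normal(0, 0.5))   # noisy absolute fix
    print(f"true: {true_x:.2f} m, fused estimate: {est:.2f} m")
```

The fused estimate tracks the true position more closely than either the drifting odometry or the noisy absolute fixes alone, which is the essential argument for combining multiple low-cost sensors in pose estimation.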