2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)
DOI: 10.1109/iros40897.2019.8967686
Multi-Sensor 6-DoF Localization For Aerial Robots In Complex GNSS-Denied Environments

Abstract: The need for robots autonomously navigating in more and more complex environments has motivated intense R&D efforts in making robot pose estimation more accurate and reliable. This paper presents a multi-sensor multi-hypothesis method for robust 6-DoF localization in complex environments. Robustness and accuracy requirements are addressed as follows. First, camera and LIDAR features are seamlessly integrated in the same statistical framework, benefiting from their synergies and providing robustness in scenario…

Cited by 17 publications (12 citation statements)
References 18 publications
“…It refines the parameters of the pipe (location and orientation) following a recursive Prediction-Update approach. In the Prediction stage, the robot odometry, computed from 3D LiDAR scans as in [12], is used to predict the pipe position and orientation in the point cloud. In the Update stage, the Random Sample Consensus (RANSAC) algorithm is used to compute the pipe parameters that best fit the point cloud.…”
Section: Methods
confidence: 99%
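The Prediction-Update scheme in this excerpt pairs odometry-based prediction with a RANSAC fit in the update. As a rough illustration (not the cited authors' implementation), assume the cloud has already been reduced to points near the pipe centerline, so the model is a 3-D line: two sampled points define a hypothesis, and inliers are the points within a tolerance of that line.

```python
import numpy as np

def ransac_pipe_axis(points, tol=0.05, iters=300, seed=None):
    """Toy RANSAC sketch: fit the pipe axis as the 3-D line supported by
    the most points. `points` is (N, 3); returns (origin, direction, mask)."""
    rng = np.random.default_rng(seed)
    best_o, best_d = None, None
    best_mask = np.zeros(len(points), dtype=bool)
    for _ in range(iters):
        # Hypothesis: the line through two randomly sampled points.
        i, j = rng.choice(len(points), size=2, replace=False)
        d = points[j] - points[i]
        norm = np.linalg.norm(d)
        if norm < 1e-9:
            continue
        d = d / norm
        # Perpendicular distance of every point to the candidate line.
        v = points - points[i]
        dist = np.linalg.norm(v - np.outer(v @ d, d), axis=1)
        mask = dist < tol
        # Keep the hypothesis with the largest consensus set.
        if mask.sum() > best_mask.sum():
            best_o, best_d, best_mask = points[i], d, mask
    return best_o, best_d, best_mask
```

A full pipe model would fit a cylinder (axis plus radius) to surface points, which needs more samples or surface normals per hypothesis; the structure of the RANSAC loop stays the same.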
“…Finally, at small distances, stereo cameras can be applied, fusing the point clouds produced by the camera with those provided by the 3-D LIDAR. In [38], a multi-sensor six-DoF localization method is applied. An iterative closest point (ICP) algorithm, extended to consider 3-D-to-3-D matchings between LIDARs (3-D distances) or cameras (distances in the SURF descriptor space), is used in the prediction stage.…”
Section: B. Perception
confidence: 99%
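The ICP step described in the excerpt can be illustrated with a plain point-to-point variant; the paper's extension additionally matches camera features in SURF descriptor space, while this minimal sketch uses geometric nearest neighbours only.

```python
import numpy as np

def icp(src, dst, iters=20):
    """Minimal point-to-point ICP sketch: align `src` (N, 3) onto `dst` (M, 3).
    Returns R, t such that src @ R.T + t approximates dst."""
    R, t = np.eye(3), np.zeros(3)
    cur = src.copy()
    for _ in range(iters):
        # Data association: each source point pairs with its nearest target
        # (brute force for clarity; a k-d tree would be used in practice).
        d2 = ((cur[:, None, :] - dst[None, :, :]) ** 2).sum(-1)
        matched = dst[d2.argmin(1)]
        # Closed-form rigid alignment (Kabsch/SVD) of the matched pairs.
        mu_s, mu_d = cur.mean(0), matched.mean(0)
        H = (cur - mu_s).T @ (matched - mu_d)
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
        R_step = Vt.T @ D @ U.T          # reflection-safe rotation
        t_step = mu_d - R_step @ mu_s
        # Apply the step and accumulate the total transform.
        cur = cur @ R_step.T + t_step
        R, t = R_step @ R, R_step @ t + t_step
    return R, t
```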
“…The current third generation of aerial manipulators includes several advanced features: operation in both outdoor and indoor environments [28], fully actuated platforms [29]-[32], multiple arms [33]-[36], GNSS-free navigation capabilities, on-board SLAM [37], [38], on-board perception without markers [39], [40], off-board real-time planning with control awareness [41], [42], and on-board reactivity and planning. Applications include structure assembly, contact-based inspection in refineries (pipes and tanks), and bridges [43], [44].…”
confidence: 99%
“…LIDAR sensors are also widely used, mainly as navigation sensors [24], [25]. They use the reflections of pulsed laser light on object surfaces to provide distance measurements.…”
Section: Event Cameras For Aerial Robot Perception
confidence: 99%
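The pulsed time-of-flight ranging described here reduces to range = c·t/2, since the pulse travels to the surface and back. A small sketch (helper names are illustrative, not from any cited work):

```python
import numpy as np

C = 299_792_458.0  # speed of light in m/s

def tof_to_range(round_trip_time_s):
    """Convert a pulsed-LIDAR round-trip time to a range: the pulse
    covers the distance twice, so range = c * t / 2."""
    return C * np.asarray(round_trip_time_s) / 2.0

def polar_to_xy(ranges, bearings_rad):
    """Project a planar scan (range, bearing) into sensor-frame x/y."""
    r = np.asarray(ranges)
    a = np.asarray(bearings_rad)
    return np.stack([r * np.cos(a), r * np.sin(a)], axis=-1)
```

For example, an echo arriving after the time light needs to cover 20 m corresponds to a 10 m range; stacking the per-beam ranges and bearings yields the 2-D point cloud consumed by the navigation pipeline.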