2021
DOI: 10.48550/arxiv.2111.07723
Preprint

Observation Contribution Theory for Pose Estimation Accuracy

Abstract: Improving pose estimation accuracy is currently a fundamental problem for mobile robots. This study aims to make better use of observations to enhance accuracy. The selection of feature points affects the accuracy of pose estimation, which raises the question of how the contribution of each observation influences the system. Accordingly, the contribution of information to the pose estimation process is analyzed, and the uncertainty model, sensitivity model, and contribution theory are formulated, prov…
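
The truncated abstract names an uncertainty model for pose estimation without showing its form. As background, a common first-order relation, given here only as an illustration of what such models typically build on and not necessarily the paper's exact formulation, ties the pose covariance to the stacked measurement Jacobian:

```latex
% First-order (Gauss-Newton) uncertainty propagation for a pose estimate.
% J stacks the Jacobians of all observation residuals w.r.t. the 6-DoF pose;
% \Sigma is the measurement noise covariance.
\Sigma_{\text{pose}} \approx \left( J^{\top} \Sigma^{-1} J \right)^{-1}
```

Under this reading, an observation's "contribution" can be understood as how much its Jacobian rows tighten this covariance, which is one way to interpret how the citing papers quoted below apply the theory to point selection.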

Cited by 1 publication (2 citation statements)
References 49 publications

“…Therefore, according to Ref. [33], we selected points from the point cloud and kept the points with low uncertainty for odometry estimation and the following point cloud for cross-source map registration. We can use the Jacobian matrix of the distance to measure their uncertainty.…”
Section: Odometry Estimation
confidence: 99%
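
The selection rule in the statement above is described only in words: measure each point's uncertainty through the Jacobian of its distance residual and keep the low-uncertainty points for odometry. A minimal Python/NumPy sketch of that idea follows; the point-to-plane residual, the norm-of-Jacobian score, and the keep ratio are illustrative assumptions, not the exact criterion of Ref. [33] or the citing paper.

```python
import numpy as np

def point_to_plane_jacobian(p, n):
    """Row Jacobian (length 6) of the point-to-plane distance
    d = n . (R p + t - q) w.r.t. a small pose perturbation
    [rotation | translation], i.e. [ (p x n)^T  n^T ]."""
    return np.hstack((np.cross(p, n), n))

def select_points_by_jacobian(points, normals, keep_ratio=0.3):
    """Score every point by the norm of its distance Jacobian and keep the
    top fraction. Treating a larger Jacobian norm as a stronger
    (lower-uncertainty) constraint on the pose is an assumption made for
    this sketch, not a statement of the cited papers' exact rule."""
    scores = np.array([np.linalg.norm(point_to_plane_jacobian(p, n))
                       for p, n in zip(points, normals)])
    k = max(1, int(keep_ratio * len(points)))
    return np.argsort(-scores)[:k]  # indices of retained points

# Hypothetical usage with (N, 3) arrays of points and surface normals:
# idx = select_points_by_jacobian(points, normals)
# odometry_points = points[idx]
```

Any scalar summary of the Jacobian (its norm, the smallest singular value of the stacked block, etc.) could serve as the score; the norm is used here only to keep the sketch short.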
“…Therefore, we conduct a selection of points according to Ref. [33] and retain the points with the high contribution for updating the particles' weights. This improves both the computational efficiency and the matching accuracy.…”
Section: Point Cloud To Cross-source Map Registration
confidence: 99%
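
The second statement applies the same selection idea inside a particle filter: only high-contribution points are used when updating the particles' weights, trading a smaller measurement set for better-conditioned matching. Below is a hedged NumPy sketch of that update step, assuming a precomputed per-point contribution score and a user-supplied measurement likelihood; likelihood_fn is a hypothetical placeholder, not an API from the cited work.

```python
import numpy as np

def reweight_particles(weights, particle_poses, scan_points, contribution,
                       likelihood_fn, keep_ratio=0.2):
    """Keep only the highest-contribution scan points, then update each
    particle's weight with a measurement likelihood evaluated on that
    subset. 'contribution' is a per-point score computed elsewhere
    (e.g. from the distance Jacobian); its exact form follows the cited
    contribution theory and is not reproduced here."""
    k = max(1, int(keep_ratio * len(scan_points)))
    subset = scan_points[np.argsort(-contribution)[:k]]

    # likelihood_fn(pose, subset) -> p(z | x): a hypothetical hook standing
    # in for whatever scan-matching likelihood the localization system uses.
    new_weights = np.array([w * likelihood_fn(pose, subset)
                            for w, pose in zip(weights, particle_poses)])
    return new_weights / new_weights.sum()  # normalized particle weights
```

Restricting the likelihood evaluation to the retained subset is what yields the efficiency gain the statement mentions; the accuracy claim rests on the contribution score favoring well-constraining points.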