2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)
DOI: 10.1109/iros.2015.7353363

Robust incremental SLAM with consistency-checking

Abstract: Both landmark measurements and loop closures are used to correct for odometry drift in SLAM solutions. However, if any of the measurements are incorrect (e.g. due to perceptual aliasing), standard SLAM algorithms fail catastrophically and cannot return an accurate map. A number of algorithms have been proposed that are robust to loop closure errors, but it is shown in this paper that they cannot provide robust solutions when landmark measurement errors occur. The root cause of the problem is that mos…
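The consistency-checking idea in the title can be pictured as a gate on each candidate measurement: a measurement is only admitted if its residual against the current estimate is statistically plausible. The sketch below shows one common form of such a gate, a chi-squared test on the Mahalanobis distance; the function name and threshold are illustrative assumptions, not the paper's actual algorithm.

```python
# A minimal sketch of a per-measurement consistency gate, assuming a
# chi-squared test on the Mahalanobis distance (illustrative, not the
# paper's method).
import numpy as np
from scipy.stats import chi2

def is_consistent(residual, covariance, significance=0.95):
    """Accept the measurement if its squared Mahalanobis distance is below
    the chi-squared quantile for its dimension."""
    d2 = residual @ np.linalg.solve(covariance, residual)
    threshold = chi2.ppf(significance, df=residual.shape[0])
    return d2 <= threshold

# Example: a 2-D landmark measurement residual with its innovation covariance.
residual = np.array([0.3, -0.1])
covariance = np.diag([0.05, 0.05])
print(is_consistent(residual, covariance))  # True for this small residual
```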


Cited by 34 publications (28 citation statements); References 20 publications.
“…Recently in the SLAM community, Graham [4] used an expectation maximization (EM) approach to smoothly transition poorly matched measurements to assumed outliers. The EM algorithm iterates between covariance weight selection and optimal variable assignments and suppresses "misbehaved" measurements by emphasizing the majority of constraints with consensus.…”
Section: Previous Work
confidence: 99%
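The quoted EM-based approach can be made concrete with a small sketch: alternate between an E-step that assigns each measurement an inlier weight from its current residual and an M-step that re-solves the weighted problem. The toy below collapses the estimation problem to a single scalar so the alternation is easy to see; the function name, noise levels, and inlier/outlier model are assumptions for illustration, not Graham et al.'s formulation.

```python
# Hedged sketch of an EM-style robust weighting loop, assuming a simple
# two-component (inlier/outlier) Gaussian model on a scalar state.
import numpy as np

def em_robust_mean(measurements, sigma_in=0.1, sigma_out=2.0, iters=20):
    """Estimate a scalar state from measurements, some of which are outliers.
    The weights play the role of covariance scaling on each constraint."""
    x = np.median(measurements)               # initial estimate
    for _ in range(iters):
        r2 = (measurements - x) ** 2
        # E-step: responsibility that each measurement is an inlier.
        p_in = np.exp(-0.5 * r2 / sigma_in**2) / sigma_in
        p_out = np.exp(-0.5 * r2 / sigma_out**2) / sigma_out
        w = p_in / (p_in + p_out)
        # M-step: weighted least-squares update (closed form for a mean).
        x = np.sum(w * measurements) / np.sum(w)
    return x, w

z = np.array([1.0, 1.1, 0.9, 1.05, 5.0])      # last value is an outlier
x_hat, weights = em_robust_mean(z)
print(x_hat)   # close to 1.0; the outlier's weight is driven toward 0
```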
“…Similar ideas appear in the context of robust estimation, e.g., the Penalized Trimmed Squares estimator of Zioutas and Avramidis [33]. Latif et al. in [20] and Graham et al. in [11] look for "internally consistent" constraints, which are in mutual agreement. Carlone et al. in [6] use ℓ1-relaxation to find a large set of mutually-consistent measurements.…”
Section: Introduction
confidence: 81%
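A minimal sketch of the "internally consistent constraints" idea mentioned above: test candidate constraints pairwise for agreement and keep a large mutually-consistent subset. The scalar consistency test and the greedy selection here are illustrative stand-ins, not the algorithms of Latif et al., Graham et al., or Carlone et al.

```python
# Hedged sketch: build a pairwise-consistency relation over candidate loop
# closures and greedily keep a large mutually-agreeing subset.
import numpy as np

def pairwise_consistent(zi, zj, tol=0.5):
    """Toy test: two scalar 'loop-closure corrections' agree if they imply
    a similar drift."""
    return abs(zi - zj) < tol

def largest_consistent_subset(candidates):
    n = len(candidates)
    agree = [[pairwise_consistent(candidates[i], candidates[j])
              for j in range(n)] for i in range(n)]
    # Greedy: start from the candidate agreeing with the most others, then
    # only add candidates consistent with everything already chosen.
    order = sorted(range(n), key=lambda i: -sum(agree[i]))
    chosen = []
    for i in order:
        if all(agree[i][j] for j in chosen):
            chosen.append(i)
    return sorted(chosen)

corrections = [0.2, 0.25, 0.3, 3.0, -2.5]      # last two disagree with the rest
print(largest_consistent_subset(corrections))  # [0, 1, 2]
```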
“…More recently, Graham [76] suggests using an expectation maximization (EM) approach to smoothly transition poorly matched measurements to assumed "outliers" by adjusting their measurement covariance. The EM algorithm is used to iterate between covariance weight selection and optimal variable assignments, thereby suppressing outlier-like measurements and emphasizing the majority of constraints which form consensus.…”
Section: Null-hypothesis Approaches
confidence: 99%
“…For example, consider a point on the lower right just above the unimodal parametric curve: (i) What is the complexity of tracking a limited number of hypotheses across the entire solution (FastSLAM [81]) versus tracking more modes in regions of the solution? (ii) Can we do better than assumed "outlier" approaches where measurements are de-weighted as null hypotheses, such as switch-type variables [76,206,219]? (iii) Can we select a piece of the problem where clear uncertainties exist and then treat just that portion in some special manner, leaving the rest as a conventional max-product parametric solution?…”
Section: Multi-modality: Displacing Assumptions
confidence: 99%
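The switch-type variables mentioned in (ii) can be sketched as follows: each loop closure's residual is scaled by a scalar switch, and a prior pulls the switch toward 1, so the optimizer can effectively disable a constraint it cannot reconcile. Everything below (the 1-D toy pose graph, variable names, the prior weight) is an illustrative assumption in the spirit of switchable constraints, not the cited implementations.

```python
# Hedged sketch of switch variables on loop closures in a 1-D pose graph,
# assuming a least-squares solve over poses and switches jointly.
import numpy as np
from scipy.optimize import least_squares

def residuals(params, odometry, closures, switch_sigma=1.0):
    """Stacked residuals for 1-D poses x and one switch variable per closure."""
    n = len(odometry) + 1
    x, s = params[:n], params[n:]
    res = [x[0]]                                      # anchor first pose at 0
    res += [x[i + 1] - x[i] - o for i, o in enumerate(odometry)]
    for k, (i, j, z) in enumerate(closures):
        res.append(s[k] * (x[j] - x[i] - z))          # switched loop closure
        res.append((1.0 - s[k]) / switch_sigma)       # prior keeping s near 1
    return np.array(res)

odometry = [1.0, 1.0, 1.0]                 # ground-truth poses: 0, 1, 2, 3
closures = [(0, 3, 3.05), (0, 2, -4.0)]    # (i, j, z); the second is an outlier
x0 = np.concatenate([np.zeros(4), np.ones(2)])
sol = least_squares(residuals, x0, args=(odometry, closures))
print(sol.x[:4])   # poses should stay near 0, 1, 2, 3
print(sol.x[4:])   # the outlier closure's switch should be driven toward 0
```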