2010
DOI: 10.3390/s110100062

Analysis of Different Feature Selection Criteria Based on a Covariance Convergence Perspective for a SLAM Algorithm

Abstract: This paper introduces several non-arbitrary feature selection techniques for a Simultaneous Localization and Mapping (SLAM) algorithm. The feature selection criteria are based on the determination of the most significant features from a SLAM convergence perspective. The SLAM algorithm implemented in this work is a sequential EKF (Extended Kalman filter) SLAM. The feature selection criteria are applied on the correction stage of the SLAM algorithm, restricting it to correct the SLAM algorithm with the most sign…
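
The correction-stage restriction described in the abstract can be pictured with a short sketch. The following Python snippet is a minimal, hypothetical illustration, not the paper's implementation: a sequential EKF correction loop that only applies the measurements returned by a user-supplied selection rule. All names (`ekf_correct_selected`, `select`, `h`, `H_jac`) are assumptions made for illustration.

```python
import numpy as np

def ekf_correct_selected(x, P, observations, h, H_jac, R, select):
    """Sequential EKF correction using only the features chosen by `select`.

    x, P         : state mean and covariance
    observations : iterable of (z, landmark_index) measurements
    h, H_jac     : measurement model and its Jacobian, both functions of (x, index)
    R            : observation-noise covariance (assumed common to all features)
    select       : callable (x, P, observations) -> subset of observations to use
    """
    for z, idx in select(x, P, observations):
        H = H_jac(x, idx)                      # linearize around the current mean
        S = H @ P @ H.T + R                    # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
        x = x + K @ (z - h(x, idx))            # corrected mean
        P = (np.eye(len(x)) - K @ H) @ P       # corrected covariance
    return x, P
```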


Cited by 15 publications (23 citation statements)
References 20 publications
“…Let z ∈ N be a feature (an observation) from the environment and R the covariance matrix of the corresponding observation noise, obtained from the feature extraction procedure. Equation (3) implies that the most significant feature in N at time k is the one that causes the largest decrement of the uncertainty volume of the corrected covariance matrix of the Gaussian estimation algorithm (see [29]). Therefore, the selection criterion implemented in this work consists of using only the optimal features found in Equation (3) to correct the estimation algorithm.…”
Section: General Proposal
confidence: 99%
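
A rough reading of this criterion: for each candidate observation, compute the covariance the EKF would have after correcting with that observation alone, and keep the one whose corrected covariance has the smallest determinant (uncertainty volume). The sketch below is an illustration of that idea under stated assumptions, not the paper's Equation (3); `most_significant_feature` and the `candidates` layout are hypothetical.

```python
import numpy as np

def most_significant_feature(P, candidates, R):
    """Pick the observed feature whose individual EKF correction yields the
    smallest determinant (uncertainty volume) of the corrected covariance.

    candidates : iterable of (feature_id, H) pairs, H the measurement Jacobian
    R          : observation-noise covariance from the feature extractor
    """
    best_id, best_vol = None, np.inf
    I = np.eye(P.shape[0])
    for feature_id, H in candidates:
        S = H @ P @ H.T + R                    # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)         # Kalman gain for this feature
        P_corr = (I - K @ H) @ P               # covariance after correcting with it
        vol = np.linalg.det(P_corr)            # uncertainty volume after correction
        if vol < best_vol:
            best_id, best_vol = feature_id, vol
    return best_id, best_vol
```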
“…Thus, under the same hypothesis of Lemma V, for the EKF case: |P|_{R_a} ≥ |P|_{R_b}; and, for the EIF case: |Ω|_{R_a} ≤ |Ω|_{R_b}. Both expressions can be directly obtained from Equations (2.1) and (12), considering that the same feature has the same Jacobian matrix associated with it [29].…”
Section: The Unscented Kalman Filter
confidence: 99%
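
The determinant ordering quoted here can be checked numerically for a toy case. The snippet below assumes R_a dominates R_b in the positive-semidefinite sense and uses the standard EKF covariance update; the matrices are arbitrary illustrative values, not data from either paper.

```python
import numpy as np

def corrected_cov(P, H, R):
    """Standard EKF covariance update for one measurement with noise R."""
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    return (np.eye(P.shape[0]) - K @ H) @ P

rng = np.random.default_rng(0)
P = np.diag([0.5, 0.3, 0.2])            # prior covariance (arbitrary toy values)
H = rng.normal(size=(2, 3))             # measurement Jacobian (arbitrary toy values)
R_b = 0.1 * np.eye(2)
R_a = R_b + 0.2 * np.eye(2)             # R_a >= R_b in the PSD sense

P_a, P_b = corrected_cov(P, H, R_a), corrected_cov(P, H, R_b)
print(np.linalg.det(P_a) >= np.linalg.det(P_b))          # EKF case: |P|_Ra >= |P|_Rb
print(1 / np.linalg.det(P_a) <= 1 / np.linalg.det(P_b))  # EIF case: |Omega|_Ra <= |Omega|_Rb
```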
“…The new state vector includes new landmarks (whose initialization has just been performed) and previously used landmarks. Auat Cheein and Carelli [26] propose an efficient method to select landmarks for the estimation task. It is based on evaluating the influence of a given feature on the convergence of the state covariance matrix.…”
Section: Map Management
confidence: 99%
“…The method matches all possible landmarks and computes (I − K_k H_k) from Equation (12). Unfortunately, we cannot implement it exactly as proposed by [26] due to the high computing time. We chose to add the landmarks, based on the previous estimation step, by selecting the previous landmarks which have the best previous influence on the convergence of the state covariance matrix.…”
Section: Map Management
confidence: 99%
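
One way to picture the approximation described in this citing work is to score each previously seen landmark by the contraction factor (I − K_k H_k) it produced at the last correction step and keep only the most influential ones. The sketch below is a hypothetical rendering of that idea, under the assumption that det(I − K_k H_k) ≤ 1 measures how strongly a landmark shrank the covariance; the function name and data layout are not from the cited papers.

```python
import numpy as np

def rank_landmarks_by_previous_influence(P_prev, prev_jacobians, R, n_keep):
    """Keep the n_keep landmarks whose previous correction contracted the
    state covariance the most, measured by det(I - K_k H_k).

    prev_jacobians : dict mapping landmark_id -> Jacobian H_k evaluated at the
                     previous estimate (hypothetical layout, not from [26])
    """
    I = np.eye(P_prev.shape[0])
    scores = {}
    for lm_id, H in prev_jacobians.items():
        S = H @ P_prev @ H.T + R
        K = P_prev @ H.T @ np.linalg.inv(S)
        # det(I - K H) <= 1; a smaller value means this landmark shrank the
        # covariance more strongly, i.e. it contributed more to convergence.
        scores[lm_id] = np.linalg.det(I - K @ H)
    return sorted(scores, key=scores.get)[:n_keep]
```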