2015
DOI: 10.3390/s150923431
Kullback-Leibler Divergence-Based Differential Evolution Markov Chain Filter for Global Localization of Mobile Robots

Abstract: One of the most important skills desired for a mobile robot is the ability to obtain its own location even in challenging environments. The information provided by the sensing system is used here to solve the global localization problem. In our previous work, we designed different algorithms founded on evolutionary strategies in order to solve the aforementioned task. The latest developments are presented in this paper. The engine of the localization module is a combination of the Markov chain Monte Carlo samp…
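The abstract is cut off by the source page, but the title and the visible text indicate that the localization engine couples differential evolution proposals with Markov chain Monte Carlo acceptance, scored by a KL-divergence cost against sensor data. Below is a minimal sketch of one DE-MC generation under that reading; kl_cost, gamma, and noise are assumptions for illustration, not the authors' code.

```python
import numpy as np

# Minimal sketch of one Differential Evolution Markov Chain (DE-MC)
# generation. `kl_cost(pose)` is a hypothetical function returning the
# KL divergence between the observed sensor histogram and the one
# predicted at `pose`; gamma and noise are illustrative values.
def de_mc_step(population, kl_cost, gamma=0.8, noise=0.01, rng=None):
    rng = rng if rng is not None else np.random.default_rng()
    n = len(population)  # needs at least 3 chains
    new_pop = population.copy()
    for i in range(n):
        # DE proposal: move chain i along the difference of two other chains.
        a, b = rng.choice([j for j in range(n) if j != i], size=2, replace=False)
        proposal = (population[i]
                    + gamma * (population[a] - population[b])
                    + rng.normal(0.0, noise, size=population[i].shape))
        # Metropolis acceptance with the KL divergence as the cost:
        # lower cost means the pose explains the observation better.
        delta = kl_cost(proposal) - kl_cost(population[i])
        if delta < 0 or rng.random() < np.exp(-delta):
            new_pop[i] = proposal
    return new_pop
```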

Cited by 12 publications (4 citation statements) | References 51 publications
“…In fact, different sample sizes contain different propensities of probability information. Here we use the Kullback-Leibler (KL) divergence [38] to quantify the difference in probability information between 100 elements of sampled data (initial dataset) and other sample sizes (N = 20, 30, 40, 50, 60, and 80) of stagger-angle error data. These measured data have been shown in Section 4.1.…”
Section: Kullback-Leibler Divergence Analysis (citation type: mentioning; confidence: 99%)
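For reference, the discrete KL divergence that this statement relies on, with P the empirical distribution of the full 100-sample dataset and Q that of a smaller sample (the common binning into k bins is an assumption here):

$$ D_{\mathrm{KL}}(P \,\|\, Q) \;=\; \sum_{i=1}^{k} P(i)\,\log \frac{P(i)}{Q(i)} $$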
“…Also, a quotient distance is used (with a normalization that results in non-symmetry) in Shannon's or entropy-type metrics, e.g., Kullback-Leibler Divergence (KLD) and Jeffreys Divergence (JD) (Cha, 2007; Kullback and Leibler, 1951). Martin, Moreno, Garrido and Blanco (2015) found that the KLD-based method outperformed the L2-based measure in the presence of contaminated noise in their global localization experiment with mobile robots.…”
Section: Error (Magnitude of Error) (citation type: mentioning; confidence: 99%)
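The non-symmetry mentioned in that quote is intrinsic to KLD: swapping P and Q changes the value. Jeffreys Divergence is its standard symmetrization:

$$ D_{J}(P,Q) \;=\; D_{\mathrm{KL}}(P\,\|\,Q) + D_{\mathrm{KL}}(Q\,\|\,P) \;=\; \sum_{i}\bigl(P(i)-Q(i)\bigr)\log\frac{P(i)}{Q(i)} $$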
“…The proposal distribution of the particle filter algorithm is regenerated using the KL divergence after incorporating the latest measurement values, so the new proposal distribution approaches the actual posterior distribution [8]. Martin et al. proposed the Kullback–Leibler divergence-based differential evolution Markov chain filter for global localization of mobile robots in challenging environments [9], where the KL divergence is the basis of the cost function being minimized. The work in [3] provides a better measurement method for estimating the posterior distribution, applying KL minimization to the prediction and update steps of the filtering algorithm, but it only provides the proof of the KL-divergence minimization.…”
Section: Related Work (citation type: mentioning; confidence: 99%)
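As a concrete illustration of the KL-based regeneration idea described in that statement, one can compare the weighted (posterior-like) and unweighted (proposal) particle histograms and regenerate the proposal when their divergence grows. This is a sketch under assumed names (kl_between_histograms, needs_regeneration, the bin count and threshold), not the exact method of [3], [8], or [9]:

```python
import numpy as np

def kl_between_histograms(p, q, eps=1e-12):
    """Discrete KL divergence D(P || Q) for two unnormalized histograms."""
    p = p + eps          # smooth empty bins to keep the log finite
    q = q + eps
    p /= p.sum()         # normalize to probability distributions
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

def needs_regeneration(particles, weights, bins=20, threshold=0.5):
    """Compare the weighted and unweighted histograms of a 1-D particle
    state; a large KL divergence means the proposal has drifted from the
    posterior and should be regenerated."""
    edges = np.histogram_bin_edges(particles, bins=bins)
    posterior_hist, _ = np.histogram(particles, bins=edges, weights=weights)
    proposal_hist, _ = np.histogram(particles, bins=edges)
    return kl_between_histograms(posterior_hist.astype(float),
                                 proposal_hist.astype(float)) > threshold
```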