In particle filter-based visual trackers, dynamic velocity components are typically incorporated into the state update equations. In such cases, uncertainty introduced at the model update stage can be amplified in unexpected and undesirable ways, leading to erroneous tracker behavior. Moreover, a weak appearance model can make the estimates provided by the particle filter inaccurate. To address this problem, we propose a continuously adaptive approach to estimating uncertainty in the particle filter, one that balances the uncertainty in its static and dynamic elements. We provide a quantitative performance evaluation of the resulting particle filter tracker on a set of ten video sequences, reporting results in terms of a metric for the objective evaluation of visual trackers and using this metric to compare our modified particle filter tracker with the continuously adaptive mean shift tracker. Results show that adaptive parameter estimation significantly improves the performance of the particle filter, particularly under occlusion and erratic, nonlinear target motion.
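To make the adaptive-uncertainty idea concrete, the sketch below (Python with NumPy) shows a constant-velocity particle filter prediction step in which the process-noise scales for the static (position) and dynamic (velocity) parts of the state are re-estimated each frame from the particle weights. This is a minimal illustration under stated assumptions, not the method of the paper: the function names, the effective-sample-size heuristic, and all constants are illustrative choices.

```python
import numpy as np

# Minimal sketch, not the paper's exact method: a constant-velocity particle
# filter whose process-noise scales are re-estimated each frame from the
# particle weights, shifting uncertainty between the static (position) and
# dynamic (velocity) parts of the state. Names, the effective-sample-size
# heuristic, and the constants are illustrative assumptions.

def predict(particles, sigma_pos, sigma_vel, dt=1.0):
    """Propagate particles with state [x, y, vx, vy] under a constant-velocity model."""
    out = particles.copy()
    out[:, 0:2] += dt * out[:, 2:4]                                      # deterministic motion
    out[:, 0:2] += np.random.normal(0.0, sigma_pos, out[:, 0:2].shape)   # static (position) noise
    out[:, 2:4] += np.random.normal(0.0, sigma_vel, out[:, 2:4].shape)   # dynamic (velocity) noise
    return out

def adapt_noise(weights, base_pos=2.0, base_vel=1.0, max_scale=5.0):
    """When the effective sample size is low (weak appearance likelihood),
    inflate position noise and damp velocity noise so that erratic motion is
    absorbed by the static component rather than amplified by the dynamic one."""
    n_eff = 1.0 / np.sum(weights ** 2)       # effective sample size
    confidence = n_eff / len(weights)        # in (0, 1]
    scale = 1.0 + (max_scale - 1.0) * (1.0 - confidence)
    return base_pos * scale, base_vel / scale

if __name__ == "__main__":
    np.random.seed(0)
    n = 500
    particles = np.zeros((n, 4))             # all particles start at the origin, at rest
    sigma_pos, sigma_vel = 2.0, 1.0
    for frame in range(10):
        particles = predict(particles, sigma_pos, sigma_vel)
        weights = np.random.rand(n)          # stand-in for an appearance likelihood
        weights /= weights.sum()
        sigma_pos, sigma_vel = adapt_noise(weights)
        idx = np.random.choice(n, size=n, p=weights)   # multinomial resampling for brevity
        particles = particles[idx]
```

The heuristic here is only meant to show where such an adaptation hooks into a standard filter loop: when the appearance likelihood is weak, position uncertainty grows and velocity uncertainty shrinks, so the dynamic component does not amplify spurious motion.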