Human matching between different fields of view is a difficult problem in intelligent video surveillance, and fusing multiple features has become a powerful tool for solving it. To guide the fusion scheme, the matching performance of these features must be evaluated. In this paper, four typical features are chosen for evaluation: the Color Histogram, UV Chromaticity, the Major Color Spectrum Histogram, and the Scale-Invariant Feature Transform (SIFT). A large amount of video data is collected to test their overall accuracy, robustness, and real-time applicability. Robustness is measured under illumination changes, Gaussian and salt-and-pepper noise, foreground segmentation errors, resolution changes, and camera angle differences. The experimental results show that the four features perform distinctly differently under these conditions, which provides important references for feature fusion methods.
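As a rough illustration of the simplest of the four evaluated features, the sketch below compares two person crops by their global color histograms, one common way such a feature is used for cross-view matching. It is a minimal example assuming OpenCV; the bin counts, the Bhattacharyya distance, and the synthetic test crops are illustrative assumptions, not the exact configuration used in the paper's evaluation.

```python
import cv2
import numpy as np


def color_histogram_distance(crop_a, crop_b, bins=(8, 8, 8)):
    """Compare two person crops via normalized RGB color histograms.

    Returns the Bhattacharyya distance: 0 means identical histograms,
    larger values mean less similar appearance.
    """
    hists = []
    for crop in (crop_a, crop_b):
        # 3-D histogram over the B, G, R channels of the crop.
        hist = cv2.calcHist([crop], [0, 1, 2], None, list(bins),
                            [0, 256, 0, 256, 0, 256])
        cv2.normalize(hist, hist)
        hists.append(hist)
    return cv2.compareHist(hists[0], hists[1], cv2.HISTCMP_BHATTACHARYYA)


if __name__ == "__main__":
    # Synthetic stand-ins for two foreground crops of the same person seen
    # from different cameras (real use would take detector/tracker output).
    rng = np.random.default_rng(0)
    crop_cam1 = rng.integers(0, 256, size=(128, 64, 3), dtype=np.uint8)
    # Simulate an illumination change between the two fields of view.
    crop_cam2 = cv2.convertScaleAbs(crop_cam1, alpha=1.2, beta=10)
    print("Bhattacharyya distance:", color_histogram_distance(crop_cam1, crop_cam2))
```

In practice, a matching decision would threshold this distance (or rank candidates by it), and the paper's evaluation asks how stable such scores remain under the listed disturbances such as illumination changes, noise, and resolution differences.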