Video decolorization filters out the color information in a video while preserving its perceivable content as completely and accurately as possible. Existing methods mainly apply image decolorization strategies frame by frame, which can be slow and produce temporally incoherent results. In this paper, we propose a video decolorization framework that enforces frame coherence and reduces decolorization time by reusing information from previously decolorized frames. Our work makes three main contributions. First, we define a decolorization proximity that measures the similarity between adjacent frames. Second, we propose three decolorization strategies, for frames with low, medium, and high proximity, to preserve the quality of each of these three types of frames. Third, we propose a novel decolorization Gaussian mixture model that classifies frames by their decolorization proximity and assigns the appropriate strategy to each. We evaluate our results from three aspects: 1) qualitative comparison; 2) quantitative comparison; and 3) a user study. We use the color contrast preserving ratio and C2G-SSIM to assess the quality of single-frame decolorization, and we propose a novel temporal coherence degree metric to evaluate the temporal coherence of the decolorized video. Compared with existing methods, the proposed approach performs better overall in time efficiency, temporal coherence, and quality preservation.
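To make the classification step concrete, the sketch below fits a one-dimensional Gaussian mixture model to per-frame proximity scores and labels each frame as low, medium, or high proximity, which is the role the abstract assigns to the decolorization Gaussian mixture model. This is a minimal illustration, not the paper's implementation: the proximity values, the EM details, and all function names are assumptions, and the actual model may use different features or component counts.

```python
# Hypothetical sketch: classify frames into low/medium/high
# "decolorization proximity" groups with a 1-D Gaussian mixture fitted
# by plain EM. All names and the toy scores are assumptions.
import math

def fit_gmm_1d(xs, k=3, iters=100):
    """Fit a k-component 1-D Gaussian mixture with EM.
    Returns (weights, means, variances)."""
    lo, hi = min(xs), max(xs)
    # Spread the initial means evenly over the observed range.
    means = [lo + (hi - lo) * (j + 0.5) / k for j in range(k)]
    variances = [((hi - lo) / k) ** 2 + 1e-6] * k
    weights = [1.0 / k] * k
    for _ in range(iters):
        # E-step: responsibility of each component for each point.
        resp = []
        for x in xs:
            ps = [w * math.exp(-(x - m) ** 2 / (2 * v)) / math.sqrt(v)
                  for w, m, v in zip(weights, means, variances)]
            s = sum(ps) or 1e-12
            resp.append([p / s for p in ps])
        # M-step: re-estimate weights, means, and variances.
        for j in range(k):
            nj = sum(r[j] for r in resp) or 1e-12
            weights[j] = nj / len(xs)
            means[j] = sum(r[j] * x for r, x in zip(resp, xs)) / nj
            variances[j] = sum(r[j] * (x - means[j]) ** 2
                               for r, x in zip(resp, xs)) / nj + 1e-6
    return weights, means, variances

def classify_by_proximity(proximities, k=3):
    """Label each frame 0 (low) .. k-1 (high) by its most likely component."""
    weights, means, variances = fit_gmm_1d(proximities, k)
    # Reorder component labels by mean so labels read low -> high.
    order = sorted(range(k), key=lambda j: means[j])
    rank = {j: r for r, j in enumerate(order)}
    labels = []
    for x in proximities:
        ps = [w * math.exp(-(x - m) ** 2 / (2 * v)) / math.sqrt(v)
              for w, m, v in zip(weights, means, variances)]
        labels.append(rank[max(range(k), key=lambda j: ps[j])])
    return labels

# Toy proximity scores clustered near 0.1 (low), 0.5 (medium), 0.9 (high).
scores = [0.08, 0.12, 0.52, 0.49, 0.91, 0.88, 0.11, 0.90]
labels = classify_by_proximity(scores)
```

In the framework described above, each label would then select one of the three decolorization strategies, so that high-proximity frames can cheaply reuse the previous frame's decolorization while low-proximity frames are decolorized from scratch.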