Video analytics and computer vision applications face challenges when processing video sequences with low visibility. Visibility is degraded when a sequence is affected by atmospheric interference such as rain. Many approaches have been proposed to remove rain streaks from video sequences: some are based on physical features, and others on data-driven (i.e., deep-learning) models. Although physical-feature-based approaches offer better rain interpretability, extracting the appropriate features and fusing them for meaningful rain removal is challenging, because rain streaks and moving objects both have dynamic physical characteristics and are difficult to distinguish. Data-driven models, in turn, depend heavily on the variation present in the training dataset, and it is impractical to cover all possible variations during training. This paper addresses both issues and proposes a novel hybrid technique that extracts novel physical and data-driven features and combines them into an effective rain-streak removal strategy. The performance of the proposed algorithm has been compared against several relevant contemporary methods on benchmark datasets. The experimental results show that the proposed method outperforms the others in subjective, objective, and object-detection comparisons for both synthetic and real rain, removing rain streaks while retaining moving objects more effectively.
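The hybrid idea of combining a physics-based rain confidence map with a data-driven one can be illustrated with a minimal sketch. This is not the paper's actual fusion scheme; the function name, the convex-combination rule, and the 0.5 threshold are all assumptions made for illustration.

```python
import numpy as np

def fuse_rain_maps(physical_map, learned_map, alpha=0.5):
    """Hypothetical fusion sketch (not the authors' method): blend a
    physics-based rain confidence map with a network-predicted one via
    a convex combination, then threshold to a binary rain mask."""
    physical_map = np.asarray(physical_map, dtype=np.float32)
    learned_map = np.asarray(learned_map, dtype=np.float32)
    fused = alpha * physical_map + (1.0 - alpha) * learned_map
    return fused > 0.5  # boolean mask: True where rain is detected

# Example: both cues agree on the first pixel, disagree weakly on the second.
mask = fuse_rain_maps([[0.9, 0.1]], [[0.8, 0.2]])
```

Any weighting (or a learned fusion layer) could replace the fixed `alpha`; the point is simply that the two feature sources complement each other.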
In computer vision applications, the visibility of video content is crucial for accurate analysis. Visibility can be degraded by several atmospheric interferences in challenging weather; one such interference is the appearance of rain streaks. Rain streak removal has recently attracted considerable interest among researchers, as it supports applications such as autonomous cars, intelligent traffic monitoring systems, and multimedia. In this paper, we propose a novel and simple rain-streak removal method that combines three extracted visual features focusing on the temporal appearance, width, and relative location of the rain streaks. We call it the TAWL (Temporal Appearance, Width, and Location) method. The proposed TAWL method adaptively uses features from different resolutions and frame rates. Moreover, it progressively processes features from upcoming frames so that it can remove rain in real time. Experiments have been conducted on video sequences with both real and synthetic rain to compare the proposed method against relevant state-of-the-art methods. The results demonstrate that the proposed method outperforms these methods, removing more rain streaks while preserving other moving regions.
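Two of the cues named above, temporal appearance (rain brightens a pixel only transiently) and width (streaks are thin compared to moving objects), can be sketched as follows. This is a simplified illustration under stated assumptions, not the TAWL implementation: the function name, the temporal-median background estimate, and the thresholds are all hypothetical.

```python
import numpy as np

def detect_rain_mask(frames, width_max=3, intensity_delta=20):
    """Hypothetical sketch of two TAWL-style cues (not the authors' code).

    frames: (T, H, W) grayscale stack; the last frame is the current one.
    A pixel is flagged as rain when it is (a) transiently brighter than the
    temporal median (temporal-appearance cue) and (b) part of a horizontal
    run no wider than width_max pixels (width cue), so that wide moving
    objects are preserved.
    """
    frames = np.asarray(frames, dtype=np.float32)
    median = np.median(frames, axis=0)          # rough rain-free background
    current = frames[-1]
    appearance = (current - median) > intensity_delta
    mask = np.zeros_like(appearance)
    for r, row in enumerate(appearance):
        run_start = None
        for c, v in enumerate(np.append(row, False)):  # sentinel ends last run
            if v and run_start is None:
                run_start = c
            elif not v and run_start is not None:
                if c - run_start <= width_max:          # keep only thin runs
                    mask[r, run_start:c] = True
                run_start = None
    return mask

# Toy example: a 1-pixel-wide bright streak is flagged; a 5-pixel-wide
# bright moving object is not.
frames = np.full((4, 8, 8), 50.0)
frames[-1, :, 0] = 100.0    # thin "rain streak"
frames[-1, :, 3:8] = 100.0  # wide "moving object"
mask = detect_rain_mask(frames)
```

A full method would add the relative-location cue and the multi-resolution, multi-frame-rate processing described above; this sketch only shows why thin, transient regions are separable from moving objects.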