Images acquired in sand-dust weather are severely degraded, exhibiting low contrast and a strong color shift. Sand-dust particles scatter and absorb light, which blurs the image and reduces contrast, while the rapid attenuation of blue light causes the color shift. To address the color shift and poor visibility of sand-dust images, this paper proposes a sand-dust image restoration method based on reversing the blue channel prior (RBCP). Because the attenuated blue channel causes the dark channel prior (DCP) to fail, the method first reverses the blue channel of the sand-dust image and then applies the dark channel prior; we call this combination RBCP. RBCP is then used to estimate the atmospheric light and the transmission map and to recover the sand-dust image, yielding significantly improved visibility. When estimating the transmission map, a guided filter refines the coarse transmission map, and a tolerance mechanism corrects the transmission in bright sky regions to avoid sky distortion. Finally, an adaptive color adjustment factor based on the gray-world assumption is introduced into the restoration model to remove the color shift. Qualitative and quantitative experimental results demonstrate that the proposed method effectively recovers clear sand-dust images and produces results superior to those of other state-of-the-art methods.
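The core RBCP step described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' implementation: the patch size, the `omega` weight, and the simple brute-force patch minimum are standard DCP conventions assumed here, and the guided-filter refinement and tolerance mechanism are omitted.

```python
import numpy as np

def reversed_blue_dark_channel(img, patch=15):
    """Dark channel of an image whose blue channel has been reversed.

    img: float RGB image in [0, 1], shape (H, W, 3).
    Sketch of the RBCP idea: invert the blue channel, then take the
    per-pixel channel minimum followed by a local patch minimum.
    """
    rbc = img.copy()
    rbc[..., 2] = 1.0 - rbc[..., 2]      # reverse the blue channel
    mins = rbc.min(axis=2)               # min over R, G, reversed B
    # patch-wise minimum (gray-scale erosion) via padding + sliding window
    pad = patch // 2
    padded = np.pad(mins, pad, mode='edge')
    h, w = mins.shape
    dark = np.empty_like(mins)
    for i in range(h):
        for j in range(w):
            dark[i, j] = padded[i:i + patch, j:j + patch].min()
    return dark

def estimate_transmission(img, A, omega=0.95, patch=15):
    """Coarse transmission map t = 1 - omega * dark(I / A), DCP-style."""
    norm = img / np.maximum(A, 1e-6)     # A: per-channel atmospheric light
    return 1.0 - omega * reversed_blue_dark_channel(norm, patch)
```

In practice the coarse map from `estimate_transmission` would be refined with a guided filter (e.g., OpenCV's `cv2.ximgproc.guidedFilter`) before inverting the degradation model.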
Images captured in sand-dust weather exhibit color deviation and low visibility, which seriously affect computer vision systems. To solve these problems, we propose a fast and effective algorithm for enhancing images captured in sand-dust weather conditions. First, we compensate for the attenuation of the blue channel. Then, white-balancing technology corrects the color of the sand-dust-degraded image. Finally, guided image filtering enhances image contrast and edge accuracy, and an adaptive method computes the magnification factor of the detail layer to enhance image detail. Experiments on a large number of sand-dust-degraded images show that the method effectively restores the faded characteristics of such images in a short time and improves their clarity. Qualitative and quantitative evaluations demonstrate that the proposed method significantly improves images captured during sand-dust weather conditions, and the results are better than those of other methods. INDEX TERMS Sand-dust-degraded image, blue channel compensation, color correction, guided image filtering.
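The first two steps (blue-channel compensation followed by white balancing) might look like the sketch below. The compensation formula here is a hypothetical one in the spirit of channel-compensation methods, not the paper's actual expression, and the white balance uses the standard gray-world assumption.

```python
import numpy as np

def compensate_blue(img, alpha=1.0):
    """Boost the attenuated blue channel using the green channel.

    Hypothetical formula (assumed, not from the paper): add a fraction of
    the green/blue mean difference, weighted by green and by how far the
    blue value is from saturation. img: float RGB in [0, 1].
    """
    g, b = img[..., 1], img[..., 2]
    b_comp = b + alpha * (g.mean() - b.mean()) * g * (1.0 - b)
    out = img.copy()
    out[..., 2] = np.clip(b_comp, 0.0, 1.0)
    return out

def gray_world_white_balance(img):
    """Gray-world white balance: rescale each channel to the global mean."""
    means = img.reshape(-1, 3).mean(axis=0)
    gray = means.mean()
    out = img * (gray / np.maximum(means, 1e-6))
    return np.clip(out, 0.0, 1.0)
```

After these two steps, the color-corrected image would be split into base and detail layers with a guided filter, and the detail layer amplified before recombination.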
Due to the scattering and absorption of light by sandstorm particles, images taken in sandstorm environments suffer from low contrast, color distortion, and other degradations, which seriously affect outdoor computer vision systems. In this paper, we propose a novel method to enhance images captured in sandstorm weather conditions. First, the method blends two images that are directly derived from the original degraded image. Second, we use multilayer decomposition to enhance image details and use blue channel and white-balancing technology to restore image contrast and chromaticity. Third, we associate weight maps with the inputs to improve edge contrast. Finally, Laplacian pyramid fusion produces the sandstorm-free, color-corrected result. The experimental results demonstrate that the proposed method effectively restores the faded characteristics of sandstorm-degraded images and improves their clarity. Subjective and objective evaluations demonstrate that the proposed method significantly improves images captured during sandstorm weather conditions, and the results are better than those of other methods. INDEX TERMS Blue channel, image fusion, multilayer decomposition, sandstorm-degraded image.
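The weighted Laplacian pyramid fusion at the heart of this pipeline can be sketched as follows. This is a generic multiscale fusion sketch, assuming image dimensions divisible by 2 at every level and simple box/nearest resampling; the paper's weight maps (and its particular filters) are not reproduced here.

```python
import numpy as np

def _downsample(img):
    """2x downsample with a box filter (crops odd dimensions to even)."""
    h2, w2 = img.shape[0] // 2 * 2, img.shape[1] // 2 * 2
    img = img[:h2, :w2]
    return 0.25 * (img[0::2, 0::2] + img[1::2, 0::2]
                   + img[0::2, 1::2] + img[1::2, 1::2])

def _upsample(img, shape):
    """Nearest-neighbour 2x upsample, cropped to a target shape."""
    up = np.repeat(np.repeat(img, 2, axis=0), 2, axis=1)
    return up[:shape[0], :shape[1]]

def laplacian_pyramid(img, levels):
    pyr, cur = [], img
    for _ in range(levels - 1):
        down = _downsample(cur)
        pyr.append(cur - _upsample(down, cur.shape))  # band-pass residual
        cur = down
    pyr.append(cur)                                   # low-pass top level
    return pyr

def gaussian_pyramid(img, levels):
    pyr = [img]
    for _ in range(levels - 1):
        pyr.append(_downsample(pyr[-1]))
    return pyr

def pyramid_fuse(inputs, weights, levels=3):
    """Blend Laplacian pyramids of the inputs with Gaussian weight pyramids."""
    total = sum(weights) + 1e-6                       # per-pixel normalization
    weights = [w / total for w in weights]
    fused = None
    for img, w in zip(inputs, weights):
        lp = laplacian_pyramid(img, levels)
        gp = gaussian_pyramid(w, levels)
        blended = [l * g[..., None] for l, g in zip(lp, gp)]
        fused = blended if fused is None else [f + b for f, b in zip(fused, blended)]
    out = fused[-1]                                   # collapse the pyramid
    for lap in reversed(fused[:-1]):
        out = _upsample(out, lap.shape) + lap
    return out
```

Blending in the pyramid domain rather than per-pixel is what avoids visible seams where the two derived inputs disagree.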
In recent years, trackers based on correlation filters have attracted increasing attention due to their impressive tracking accuracy and real-time performance. However, in real scenarios, tracking results are often disturbed by occlusion, illumination variation, appearance variation, and background clutter. To obtain a tracker with better performance, this paper proposes a multi-information fusion correlation filter tracker, which uses channel and spatial reliabilities and temporal regularization on samples for filter training; it not only extends the target search area but also tracks targets with significant appearance variations more robustly. Results from extensive experiments on the OTB100, VOT2016, TC128, and UAV123 data sets show that our tracker, using only histogram of oriented gradients (HOG) and color name (CN) features, performs favorably against state-of-the-art trackers in terms of tracking precision, tracking success rate, tracking accuracy, and A-R rank. INDEX TERMS Object tracking, correlation filter, channel reliability, spatial reliability, temporal regularization.
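For readers unfamiliar with the correlation filter framework this tracker builds on, the sketch below shows a minimal single-channel MOSSE-style filter trained and applied in the Fourier domain. This is the classical baseline, not the authors' multi-information fusion tracker: the channel/spatial reliability weighting and temporal regularization they add are omitted, and `lam` is an assumed regularization constant.

```python
import numpy as np

def train_mosse(patches, response, lam=1e-2):
    """Train a single-channel correlation filter in the Fourier domain.

    Closed-form MOSSE solution: H = sum(G * conj(F_i)) / (sum(|F_i|^2) + lam),
    where G is the FFT of the desired response (a peak at the target).
    """
    G = np.fft.fft2(response)
    A, B = 0, 0
    for p in patches:
        F = np.fft.fft2(p)
        A = A + G * np.conj(F)          # numerator: correlation with target
        B = B + F * np.conj(F)          # denominator: input energy spectrum
    return A / (B + lam)

def detect(H, patch):
    """Correlate the filter with a new patch; return the peak location."""
    resp = np.real(np.fft.ifft2(H * np.fft.fft2(patch)))
    return np.unravel_index(resp.argmax(), resp.shape)
```

Because the correlation is circular, a spatial shift of the target produces an equal (wrap-around) shift of the response peak, which is how the tracker localizes the target from frame to frame.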