Abstract: Underwater scenes captured by cameras are plagued by poor contrast and spectral distortion, which result from the scattering and absorptive properties of water. In this paper we present a novel dehazing method that improves visibility in images and videos by detecting and segmenting image regions that contain only water. The colour of these regions, which we refer to as pure haze regions, is similar to the haze that is removed during the dehazing process. Moreover, we propose a semantic white balancing approach for illuminant estimation that uses the dominant colour of the water to address the spectral distortion present in underwater scenes. To validate the results of our method and compare them to those obtained with state-of-the-art approaches, we perform extensive subjective evaluation tests using images captured in a variety of water types and underwater videos captured onboard an underwater vehicle.
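The abstract above does not give the paper's algorithm, but the general pipeline it describes (estimate the haze/water colour, invert the image-formation model, then white-balance against the dominant water colour) can be illustrated with a generic dark-channel-prior dehazing step followed by gray-world white balancing. This is a hedged sketch of those standard techniques, not the authors' method; all function names and parameter values here are illustrative assumptions.

```python
import numpy as np

def gray_world_white_balance(img):
    """Scale each channel so its mean matches the global mean
    (the gray-world assumption); img is float RGB in [0, 1]."""
    means = img.reshape(-1, 3).mean(axis=0)
    gain = means.mean() / np.maximum(means, 1e-6)
    return np.clip(img * gain, 0.0, 1.0)

def dehaze_dark_channel(img, omega=0.9, patch=7, t_min=0.1):
    """Generic dark-channel-prior dehazing: estimate the haze colour A
    from the brightest dark-channel pixels, then invert the model
    I = J * t + A * (1 - t) to recover scene radiance J."""
    h, w, _ = img.shape
    # Dark channel: per-pixel channel minimum, then a local minimum filter.
    dark = img.min(axis=2)
    pad = patch // 2
    padded = np.pad(dark, pad, mode='edge')
    dc = np.zeros_like(dark)
    for i in range(h):
        for j in range(w):
            dc[i, j] = padded[i:i + patch, j:j + patch].min()
    # Haze colour A: mean colour of the top 0.1% dark-channel pixels
    # (analogous in spirit to sampling a "pure haze" region).
    n = max(1, int(0.001 * h * w))
    idx = np.argpartition(dc.ravel(), -n)[-n:]
    A = img.reshape(-1, 3)[idx].mean(axis=0)
    # Transmission estimate, clamped to avoid over-amplification.
    t = 1.0 - omega * (img / np.maximum(A, 1e-6)).min(axis=2)
    t = np.maximum(t, t_min)[..., None]
    J = (img - A) / t + A
    return np.clip(J, 0.0, 1.0)

# Usage: dehaze first, then correct the residual colour cast.
# restored = gray_world_white_balance(dehaze_dark_channel(hazy_rgb))
```

The paper's contribution differs in that pure haze regions are explicitly segmented and the illuminant is estimated semantically from the water colour; the sketch above only mimics the overall dehaze-then-white-balance structure.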
Accurate correction of the corrupting effects of the atmosphere and the water's surface is essential in order to obtain the optical, biological and biogeochemical properties of the water from satellite-based multi- and hyperspectral sensors. The major challenges now for atmospheric correction are the conditions of turbid coastal and inland waters and areas in which there are strongly absorbing aerosols. Here, we outline how these issues can be addressed, with a focus on the potential of new sensor technologies and the opportunities for the development of novel algorithms and aerosol models. We review hardware developments, which will provide qualitative and quantitative increases in spectral, spatial, radiometric and temporal data of the Earth, as well as measurements from other sources, such as the Aerosol Robotic Network for Ocean Color (AERONET-OC) stations, bio-optical sensors on Argo (Bio-Argo) floats and polarimeters. We provide an overview of the state of the art in atmospheric correction algorithms, highlight recent advances and discuss the potential for hyperspectral data to address the current challenges.
Within the worldwide diving community, underwater photography is becoming increasingly popular. However, the marine environment presents certain challenges for image capture, with resulting imagery often suffering from colour distortions, low contrast and blurring. As a result, image enhancement software is used not only to enhance the imagery aesthetically, but also to address these degradations. Although feature-rich image enhancement software products are available, little is known about the user experience of underwater photographers when interacting with such tools. To address this gap, we conducted an online questionnaire to better understand what software tools are being used, and face-to-face interviews to investigate the characteristics of the image enhancement user experience for underwater photographers. We analysed the interview transcripts using the pragmatic and hedonic categories from the frameworks of Hassenzahl (Funology, Kluwer Academic Publishers, Dordrecht, pp 31–42, 2003; Funology 2, Springer, pp 301–313, 2018) for positive and negative user experience. Our results reveal a moderately negative experience overall for both pragmatic and hedonic categories. We draw some insights from the findings and make recommendations for improving the user experience for underwater photographers using image enhancement tools.