The field of computational photography, and in particular the design and implementation of coded apertures, has yielded impressive results in recent years. In this paper we introduce perceptually optimized coded apertures for defocus deblurring. We obtain near-optimal apertures through an optimization procedure with a novel evaluation function that incorporates two existing perceptual image quality metrics. These metrics favour aperture patterns for which errors in the final deblurred images will not be perceived by a human observer. Our work improves on the results of a similar approach that relies solely on the L2 metric in its evaluation function.
Computational perception techniques are gaining importance at every stage of the imaging pipeline, from data capture to final display. Their goal is usually the same: to minimize visible errors in the final image. We present a method for computing coded apertures for defocus deblurring that takes human perception into account to obtain the optimal aperture pattern. We pose the search for the optimal aperture as an optimization problem and, to our knowledge, propose the first algorithm to use perceptual quality metrics in its objective function. We explore the performance of different quality metrics for the design of such coded apertures: the well-established SSIM; the state-of-the-art HDR-VDP-2, which features a comprehensive model of the human visual system (HVS); and the L2 norm, previously used in related work. Our results outperform both conventional circular apertures and a previously published aperture pattern specifically designed for defocus deblurring [1], in both simulated and real scenarios (see Figures 1 and 2). The best apertures are obtained when a combination of the three metrics is used in the objective function. This work has recently been accepted for publication in Computer Graphics Forum [2].
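To make the idea concrete, the sketch below illustrates the general structure of such an aperture search: simulate defocus blur with a candidate aperture's PSF, add sensor noise, deblur with a Wiener filter, and score the result with a combined quality/L2 objective. This is a minimal illustration, not the paper's actual algorithm: the function names (`simulate_and_deblur`, `aperture_score`), the 7×7 binary parameterization, the random search, the Wiener regularization constant, and the trade-off weight `alpha` are all hypothetical choices, and a single-window `global_ssim` stands in for the proper SSIM and HDR-VDP-2 metrics used in the paper.

```python
import numpy as np

def pad_psf(psf, shape):
    """Zero-pad a small PSF to the full image shape, centred at the origin
    so that FFT-based convolution introduces no shift."""
    out = np.zeros(shape)
    k = psf.shape[0]
    out[:k, :k] = psf
    return np.roll(out, (-(k // 2), -(k // 2)), axis=(0, 1))

def simulate_and_deblur(img, pattern, noise_sigma=0.01, wiener_k=1e-3, seed=0):
    """Blur img with the aperture's PSF, add Gaussian sensor noise, then
    recover an estimate with a flat-prior Wiener filter."""
    rng = np.random.default_rng(seed)
    psf = pattern / pattern.sum()
    H = np.fft.fft2(pad_psf(psf, img.shape))
    blurred = np.real(np.fft.ifft2(np.fft.fft2(img) * H))
    blurred += rng.normal(0.0, noise_sigma, img.shape)
    W = np.conj(H) / (np.abs(H) ** 2 + wiener_k)
    return np.real(np.fft.ifft2(W * np.fft.fft2(blurred)))

def global_ssim(a, b, c1=0.01 ** 2, c2=0.03 ** 2):
    """Single-window SSIM over the whole image -- a crude stand-in for the
    windowed SSIM / HDR-VDP-2 metrics used in the actual method."""
    mu_a, mu_b = a.mean(), b.mean()
    cov = ((a - mu_a) * (b - mu_b)).mean()
    num = (2 * mu_a * mu_b + c1) * (2 * cov + c2)
    den = (mu_a ** 2 + mu_b ** 2 + c1) * (a.var() + b.var() + c2)
    return num / den

def aperture_score(img, pattern, alpha=10.0):
    """Combined objective, higher is better: perceptual term minus a
    weighted L2 term. alpha is a hypothetical trade-off weight."""
    est = simulate_and_deblur(img, pattern)
    mse = np.mean((est - img) ** 2)
    return global_ssim(est, img) - alpha * mse

# Synthetic test scene with strong edges.
yy, xx = np.indices((64, 64))
img = (((yy + xx) % 16) < 8).astype(float)

# Baseline: an open circular aperture on a 7x7 grid.
cy, cx = np.indices((7, 7)) - 3
circular = ((cy ** 2 + cx ** 2) <= 9).astype(float)
base_score = aperture_score(img, circular)

# Random search over binary 7x7 patterns; the circular aperture is the
# starting point, so the winner can only match or beat it.
rng = np.random.default_rng(42)
best_pattern, best_score = circular, base_score
for _ in range(200):
    cand = (rng.random((7, 7)) > 0.5).astype(float)
    cand[3, 3] = 1.0  # keep the centre open so the PSF is never empty
    s = aperture_score(img, cand)
    if s > best_score:
        best_pattern, best_score = cand, s
```

The actual method replaces this toy objective with the full perceptual metrics and a more principled optimization, but the overall loop (simulate, deblur, score, keep the best pattern) is the same.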