Nowadays, there is an increasing demand for images with high definition and fine textures, but images captured in natural scenes usually suffer from complicated blur artifacts, caused mostly by object motion or camera shake. Since these artifacts greatly degrade visual quality, deblurring algorithms have been proposed from various perspectives. However, most energy-optimization-based algorithms rely heavily on blur-kernel priors, and learning-based methods often either adopt a pixel-wise loss function or ignore global structural information. We therefore propose an image deblurring algorithm based on a recurrent conditional generative adversarial network (RCGAN), in which a scale-recurrent generator extracts sequential spatio-temporal features and reconstructs sharp images in a coarse-to-fine scheme. To evaluate the generator thoroughly at both the global and local levels, we further propose a receptive-field-recurrent discriminator. In addition, the discriminator takes blurry images as conditions, which helps it differentiate reconstructed images from real sharp ones. Finally, since gradients vanish when the generator is trained with the discriminator's output alone, a progressive loss function is proposed to strengthen the gradients in back-propagation and to take full advantage of discriminative features. Extensive experiments demonstrate the superiority of RCGAN over state-of-the-art algorithms both qualitatively and quantitatively.

INDEX TERMS Image deblurring, conditional generative adversarial network, receptive field recurrent, coarse-to-fine.
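To make the overall setup concrete, the sketch below illustrates, in simplified form, two ideas named in the abstract: a coarse-to-fine restoration loop and a discriminator conditioned on the blurry input. It is not the authors' implementation; the network bodies, the fusion across scales, and the vanilla adversarial loss are placeholder assumptions standing in for the scale-recurrent generator, the receptive-field-recurrent discriminator, and the progressive loss, whose details are not given in this abstract.

```python
# Minimal sketch (not the authors' code) of a coarse-to-fine, conditional
# adversarial deblurring step. All module internals are placeholders.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TinyGenerator(nn.Module):
    """Placeholder generator: predicts a residual added to the blurry input."""
    def __init__(self):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 3, 3, padding=1),
        )

    def forward(self, blurry):
        return blurry + self.body(blurry)


class TinyConditionalDiscriminator(nn.Module):
    """Placeholder conditional discriminator: the blurry image and the
    candidate sharp image are concatenated along the channel axis."""
    def __init__(self):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(6, 32, 4, stride=2, padding=1), nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(32, 1, 4, stride=2, padding=1),
        )

    def forward(self, blurry, candidate):
        return self.body(torch.cat([blurry, candidate], dim=1))


def coarse_to_fine_deblur(gen, blurry, scales=(0.25, 0.5, 1.0)):
    """Run the generator from the coarsest scale to the finest, feeding the
    previous prediction forward (a crude stand-in for the paper's
    scale-recurrent scheme)."""
    prev = None
    for s in scales:
        x = F.interpolate(blurry, scale_factor=s, mode='bilinear',
                          align_corners=False)
        if prev is not None:
            prev_up = F.interpolate(prev, size=x.shape[-2:], mode='bilinear',
                                    align_corners=False)
            x = 0.5 * (x + prev_up)  # simple fusion; the real recurrence is richer
        prev = gen(x)
    return prev


if __name__ == "__main__":
    gen, disc = TinyGenerator(), TinyConditionalDiscriminator()
    blurry = torch.rand(1, 3, 64, 64)
    sharp = torch.rand(1, 3, 64, 64)
    restored = coarse_to_fine_deblur(gen, blurry)

    # Conditional adversarial losses in vanilla GAN form, used only for
    # illustration; the abstract's progressive loss is not reproduced here.
    bce = nn.BCEWithLogitsLoss()
    d_real = disc(blurry, sharp)
    d_fake = disc(blurry, restored.detach())
    d_loss = bce(d_real, torch.ones_like(d_real)) + \
             bce(d_fake, torch.zeros_like(d_fake))
    g_adv = bce(disc(blurry, restored), torch.ones_like(d_fake))
    print(d_loss.item(), g_adv.item())
```

The conditioning step is the key point: because the discriminator sees the blurry image alongside each candidate, it judges whether a restoration is consistent with its own degraded input rather than merely whether it looks sharp in isolation.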