Single-image super-resolution technology has been widely studied across applications to improve the quality and resolution of degraded images acquired from noise-sensitive low-resolution sensors. Because most studies on single-image super-resolution have focused on deep learning networks that run on high-performance GPUs, this study proposes an efficient and lightweight super-resolution network that achieves real-time performance on mobile devices. To replace the element-wise addition layer, which is relatively slow on mobile devices, we introduce a skip connection layer that directly concatenates the low-resolution input image with an intermediate feature map. In addition, we introduce weighted clipping to reduce the quantization errors commonly encountered during float-to-int8 model conversion. Moreover, a reparameterization method is selectively applied without increasing inference time or the number of parameters. With these contributions, the proposed network was recognized as the best solution in the Mobile AI & AIM 2022 Real-Time Single-Image Super-Resolution Challenge, with a PSNR of 30.03 dB and an NPU runtime of 19.20 ms.
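The float-to-int8 conversion mentioned above can be illustrated with a minimal sketch. The abstract does not specify the exact weighted-clipping rule, so the `clip_ratio` parameter below is a hypothetical stand-in: it shrinks the quantization range below the absolute maximum, which trades accuracy on outlier values for a finer step size on the bulk of the tensor, the general mechanism by which clipping reduces int8 quantization error.

```python
import numpy as np

def quantize_int8(w, clip_ratio=1.0):
    """Symmetric float-to-int8 quantization with an optional clipped range.

    Illustrative sketch only; the paper's weighted-clipping method is
    not reproduced here. clip_ratio < 1 narrows the representable range,
    giving a smaller step size (finer resolution) for in-range values.
    """
    t = clip_ratio * float(np.max(np.abs(w)))  # clipped range limit
    scale = t / 127.0                          # int8 step size
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Map int8 codes back to float32 approximations."""
    return q.astype(np.float32) * scale
```

For values inside the clipped range, the round-trip error is bounded by half the step size, so shrinking the range (when outliers permit) directly tightens that bound.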
The conventional embedded wavelet image coder, which exploits only the adjacent neighbors, is limited in efficiently compressing the sign coefficients, because wavelet coefficients are highly correlated along dominant image features such as edges and contours. To solve this problem, we propose direction-adaptive sign context modeling that adaptively exploits the neighbors best suited to the dominant image features. Experimental results show that sign coding based on the proposed context modeling reduces the sign bits by up to 5.5% compared with the conventional sign coding method.
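The idea of adapting the sign context to the dominant direction can be sketched as follows. This is a simplified, hypothetical illustration, not the paper's actual algorithm: the local direction is estimated here by comparing horizontal and vertical neighbor magnitudes, and the sign context is then formed from the two neighbors lying along the stronger direction, where sign correlation is expected to be highest.

```python
import numpy as np

def sign_context(coeffs, y, x):
    """Form a sign-coding context from neighbors along the locally
    dominant direction.

    Illustrative sketch: direction is chosen by comparing the summed
    magnitudes of the horizontal and vertical neighbor pairs; the
    paper's direction estimation and context rules may differ.
    """
    h_energy = abs(coeffs[y, x - 1]) + abs(coeffs[y, x + 1])
    v_energy = abs(coeffs[y - 1, x]) + abs(coeffs[y + 1, x])
    if h_energy >= v_energy:
        # Strong horizontal response: take signs along the row.
        neighbors = (coeffs[y, x - 1], coeffs[y, x + 1])
    else:
        # Strong vertical response: take signs along the column.
        neighbors = (coeffs[y - 1, x], coeffs[y + 1, x])
    # Context symbol: each neighbor contributes its sign (-1, 0, or +1).
    return tuple(int(np.sign(n)) for n in neighbors)
```

An entropy coder would then maintain separate probability models per context symbol, so that sign bits along an edge are predicted from neighbors on that same edge rather than from uncorrelated ones.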