The Super-Resolution Generative Adversarial Network (SRGAN) [1] is a seminal work capable of generating realistic textures during single-image super-resolution. However, the hallucinated details are often accompanied by unpleasant artifacts. To further enhance the visual quality, we thoroughly study three key components of SRGAN – network architecture, adversarial loss, and perceptual loss – and improve each of them to derive an Enhanced SRGAN (ESRGAN). In particular, we introduce the Residual-in-Residual Dense Block (RRDB) without batch normalization as the basic network building unit. Moreover, we borrow the idea from relativistic GAN [2] to let the discriminator predict relative realness instead of absolute values. Finally, we improve the perceptual loss by using features before activation, which provides stronger supervision for brightness consistency and texture recovery. Benefiting from these improvements, the proposed ESRGAN achieves consistently better visual quality, with more realistic and natural textures, than SRGAN, and won first place in the PIRM2018-SR Challenge [3]. The code is available at https://github.com/xinntao/ESRGAN.
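The relativistic idea borrowed from [2] can be sketched concretely: instead of scoring each image as real or fake in absolute terms, the discriminator estimates how much more realistic a real image is than the average fake one. A minimal NumPy illustration of the relativistic average discriminator loss (function names and the toy logits are ours, not from the ESRGAN codebase):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def ra_d_loss(real_logits, fake_logits):
    """Relativistic average discriminator loss (a sketch).

    Each real logit is compared against the MEAN fake logit, and
    vice versa, so the discriminator predicts relative realness
    rather than an absolute real/fake probability.
    """
    d_real = sigmoid(real_logits - fake_logits.mean())
    d_fake = sigmoid(fake_logits - real_logits.mean())
    # Binary cross-entropy: reals should be relatively more realistic.
    return -(np.log(d_real).mean() + np.log(1.0 - d_fake).mean())

# Toy logits: the discriminator scores reals higher than fakes,
# so the relativistic loss is small; swapping them makes it large.
real = np.array([2.0, 1.0])
fake = np.array([-1.0, -2.0])
print(ra_d_loss(real, fake))
```

The generator side uses the mirrored objective, which (unlike the standard GAN loss) lets gradients from real images also flow to the generator.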
Video restoration tasks, including super-resolution and deblurring, are drawing increasing attention in the computer vision community. A challenging benchmark named REDS was released in the NTIRE19 Challenge. This new benchmark challenges existing methods in two respects: (1) how to align multiple frames given large motions, and (2) how to effectively fuse different frames with diverse motion and blur. In this work, we propose a novel Video Restoration framework with Enhanced Deformable convolutions, termed EDVR, to address these challenges. First, to handle large motions, we devise a Pyramid, Cascading and Deformable (PCD) alignment module, in which frame alignment is done at the feature level using deformable convolutions in a coarse-to-fine manner. Second, we propose a Temporal and Spatial Attention (TSA) fusion module, in which attention is applied both temporally and spatially to emphasize important features for subsequent restoration. Thanks to these modules, EDVR won first place, outperforming the second-place entry by a large margin, in all four tracks of the NTIRE19 video restoration and enhancement challenges. EDVR also demonstrates performance superior to state-of-the-art published methods on video super-resolution and deblurring. The code is available at https://github.com/xinntao/EDVR.
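The temporal half of the TSA idea can be illustrated in a few lines: each neighboring frame's features are weighted per pixel by their similarity to the reference frame, so well-aligned, informative frames contribute more to fusion. The NumPy sketch below is a deliberate simplification (the actual EDVR module computes similarities on learned embeddings and follows with spatial attention); the function name and shapes are our assumptions:

```python
import numpy as np

def temporal_attention(ref_feat, neighbor_feats):
    """Toy temporal attention over aligned frame features (a sketch).

    ref_feat:       (C, H, W)    reference-frame features
    neighbor_feats: (T, C, H, W) features of T aligned neighbor frames
    Returns reweighted neighbor features of shape (T, C, H, W).
    """
    # Per-pixel dot-product similarity to the reference frame: (T, H, W)
    sim = (neighbor_feats * ref_feat[None]).sum(axis=1)
    # Squash to (0, 1): dissimilar (e.g. blurred/misaligned) frames
    # get small weights.
    weights = 1.0 / (1.0 + np.exp(-sim))
    return neighbor_feats * weights[:, None]
```

In the full module the reweighted features are then concatenated and fused by convolutions before the reconstruction stage.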
Highlights
- We build the genomic and transcriptomic landscape of 465 primary TNBCs
- Chinese TNBC cases demonstrate more PIK3CA mutations and the LAR subtype
- Transcriptomic data classify TNBCs into four subtypes
- Multi-omics profiling identifies potential targets within specific TNBC subtypes
Programmed cell death-1 (PD-1) targeted therapies enhance T cell responses and show efficacy in multiple cancers, but the role of costimulatory molecules in this T cell rescue remains elusive. Here we demonstrate that the CD28/B7 costimulatory pathway is essential for effective PD-1 therapy during chronic viral infection of mice. Conditional gene deletion showed a cell-intrinsic requirement of CD28 for CD8 T cell proliferation after PD-1 blockade. B7 costimulation was also necessary for effective PD-1 therapy in tumor-bearing mice. In addition, we found that CD8 T cells proliferating in blood after PD-1 therapy of lung cancer patients were predominantly CD28-positive. Taken together, these data demonstrate a CD28-costimulation requirement for CD8 T cell rescue and suggest an important role for the CD28/B7 pathway in PD-1 therapy of cancer patients.