Transitional cell carcinoma (TCC) is the most common type of bladder cancer. Here we sequenced the exomes of nine individuals with TCC and screened all the somatically mutated genes in a prevalence set of 88 additional individuals with TCC with different tumor stages and grades. In our study, we discovered a variety of genes previously unknown to be mutated in TCC. Notably, we identified genetic aberrations of the chromatin remodeling genes (UTX, MLL-MLL3, CREBBP-EP300, NCOR1, ARID1A and CHD6) in 59% of our 97 subjects with TCC. Of these genes, we showed UTX to be altered substantially more frequently in tumors of low stages and grades, highlighting its potential role in the classification and diagnosis of bladder cancer. Our results provide an overview of the genetic basis of TCC and suggest that aberration of chromatin regulation might be a hallmark of bladder cancer.
Learning-based hashing is a leading approach to approximate nearest neighbor search for large-scale image retrieval. In this paper, we develop a deep supervised hashing method for multi-label image retrieval that learns a binary "mask" map identifying the approximate locations of objects in an image; this mask is then used to obtain length-limited hash codes that focus on an image's objects while ignoring the background. The proposed deep architecture consists of four parts: 1) a convolutional sub-network that generates effective image features; 2) a binary "mask" sub-network that identifies the approximate locations of image objects; 3) a weighted average pooling operation, based on the binary "mask", that yields feature representations and hash codes attending mainly to foreground objects rather than the background; and 4) a combination of a triplet ranking loss, designed to preserve relative similarities among images, and a cross-entropy loss defined on image labels. We conduct comprehensive evaluations on four multi-label image datasets. The results indicate that the proposed hashing method achieves substantial gains over state-of-the-art supervised and unsupervised hashing baselines.
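The mask-weighted pooling and the combined loss described above can be illustrated with a short sketch. This is a minimal PyTorch-style sketch under our own assumptions: the names and sizes (MaskedHashNet, hash_bits, alpha, the 1x1-conv mask head) are illustrative rather than the authors' exact design, and the convolutional backbone producing feat_map (part 1 of the architecture) is assumed to exist separately.

```python
# Illustrative sketch of mask-weighted pooling for deep hashing (not the
# authors' exact architecture; layer sizes and names are assumptions).
import torch
import torch.nn as nn
import torch.nn.functional as F

class MaskedHashNet(nn.Module):
    def __init__(self, in_channels=512, hash_bits=48, num_labels=20):
        super().__init__()
        # 2) binary "mask" sub-network: per-location foreground score
        self.mask_head = nn.Conv2d(in_channels, 1, kernel_size=1)
        # hash layer and multi-label classifier on the pooled feature
        self.hash_fc = nn.Linear(in_channels, hash_bits)
        self.cls_fc = nn.Linear(in_channels, num_labels)

    def forward(self, feat_map):
        # feat_map: (B, C, H, W) output of a convolutional backbone (part 1)
        mask = torch.sigmoid(self.mask_head(feat_map))            # (B, 1, H, W)
        # 3) weighted average pooling: foreground locations dominate
        weighted = (feat_map * mask).sum(dim=(2, 3)) / (mask.sum(dim=(2, 3)) + 1e-6)
        hash_code = torch.tanh(self.hash_fc(weighted))            # relaxed binary codes
        logits = self.cls_fc(weighted)                            # for the label loss
        return hash_code, logits

def combined_loss(anchor, positive, negative, logits, labels, margin=1.0, alpha=0.5):
    # 4) triplet ranking loss (relative similarity) + cross-entropy on labels
    triplet = F.triplet_margin_loss(anchor, positive, negative, margin=margin)
    label_loss = F.binary_cross_entropy_with_logits(logits, labels)  # multi-label targets
    return triplet + alpha * label_loss
```

At retrieval time the relaxed codes would typically be binarized, for example by taking the sign of the tanh outputs.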
Traditional object detection methods assume that the training and test data follow the same distribution, but this cannot always be guaranteed in the real world. Domain adaptation methods have been proposed to handle this situation. However, existing methods generally ignore semantic alignment at the feature level when aligning the data distributions of the source and target domains. In this paper, we propose a novel unsupervised cross-domain object detection method, named Cycle-consistent domain Adaptive Faster RCNN (CA-FRCNN). A pair of Generative Adversarial Networks (GANs) is used to make the features from the two domains consistent at both the data distribution level and the semantic level. Specifically, features from the source domain are translated to the target domain and then aligned with features from the target domain; target features are handled with the symmetric operations. Furthermore, a cycle-consistency loss is optimized to guarantee that semantic information is preserved before and after the style translations. Finally, an identity module requires that a source-domain feature be equivalent to the feature reconstructed by the source generator when its input already comes from the source domain; an analogous identity constraint is imposed on target-domain features. Experiments on multiple datasets show that our method outperforms previous state-of-the-art methods.
INDEX TERMS: Domain adaptive object detection, generative adversarial network, semantic information.
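To make the cycle-consistency and identity constraints concrete, the following is a minimal PyTorch-style sketch under assumed simplifications: the generator definitions (make_generator, G_s2t, G_t2s), the loss weights, and the omission of the adversarial discriminator terms are illustrative assumptions, not the paper's exact formulation.

```python
# Illustrative sketch of the cycle-consistency and identity losses on
# backbone features from the two domains (adversarial terms omitted).
import torch
import torch.nn as nn
import torch.nn.functional as F

def make_generator(channels=256):
    # simple 1x1-conv feature translator (stand-in for the paper's generators)
    return nn.Sequential(nn.Conv2d(channels, channels, 1), nn.ReLU(),
                         nn.Conv2d(channels, channels, 1))

G_s2t = make_generator()   # renders source-style features in target style
G_t2s = make_generator()   # renders target-style features in source style

def alignment_losses(f_src, f_tgt, lambda_cyc=10.0, lambda_id=5.0):
    # f_src, f_tgt: (B, C, H, W) backbone features from source / target images
    fake_tgt = G_s2t(f_src)            # source features translated to target style
    fake_src = G_t2s(f_tgt)            # target features translated to source style

    # cycle consistency: translating forth and back must preserve semantics
    cyc = F.l1_loss(G_t2s(fake_tgt), f_src) + F.l1_loss(G_s2t(fake_src), f_tgt)

    # identity: a generator fed features already in its output domain
    # should leave them (nearly) unchanged
    idt = F.l1_loss(G_t2s(f_src), f_src) + F.l1_loss(G_s2t(f_tgt), f_tgt)

    return lambda_cyc * cyc + lambda_id * idt
```

In a full training loop these terms would be added to the discriminators' adversarial losses and the Faster RCNN detection loss; the weights here are placeholders.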