The highly heavy metal resistant strain Cupriavidus metallidurans BS1 was isolated from the Zijin gold-copper mine in China. This was of particular interest because the extensively studied, closely related strain C. metallidurans CH34 had been shown not only to be highly resistant to heavy metals but also to reduce metal complexes and biomineralize them into metallic nanoparticles, including gold nanoparticles. After isolation, C. metallidurans BS1 was characterized, its complete genome was sequenced using PacBio, and the genome was compared to that of CH34. Many heavy metal resistance determinants were identified and shown to have wide-ranging similarities to those of CH34. However, both BS1 and CH34 displayed extensive genome plasticity, which is probably responsible for significant differences between the strains. BS1 was shown to contain three prophages, not present in CH34, that appear intact and might be responsible for shifting major heavy metal resistance determinants from plasmid to chromid (CHR2) in C. metallidurans BS1. Surprisingly, the single plasmid of BS1, pBS1 (364.4 kbp), contains only one heavy metal resistance determinant: the czc determinant, an RND-type efflux system conferring resistance to cobalt, zinc and cadmium, shown here to be highly similar to the determinant located on pMOL30 in C. metallidurans CH34. In addition, another homologous czc determinant was identified on the chromid of BS1, also most similar to the czc determinant from pMOL30 in CH34. Other heavy metal resistance determinants, such as the cnr and chr determinants located on megaplasmid pMOL28 in CH34, were shown to be adjacent to the czc determinant on the chromid (CHR2) in BS1. Additionally, other heavy metal resistance determinants such as pbr, cop, sil, and ars were located on the chromid (CHR2) rather than on pBS1. A diverse range of genomic rearrangements occurred in this strain, which was isolated from a habitat with constant exposure to high concentrations of copper, gold and other heavy metals. In contrast, the
Learning to improve AUC performance is an important topic in machine learning. However, AUC maximization algorithms may suffer degraded generalization performance on noisy data. Self-paced learning is an effective method for handling noisy data, but existing self-paced learning methods are limited to pointwise learning, whereas AUC maximization is a pairwise learning problem. To solve this challenging problem, we propose a balanced self-paced AUC maximization algorithm (BSPAUC). Specifically, we first provide a statistical objective for self-paced AUC. Based on this, we propose our self-paced AUC maximization formulation, in which a novel balanced self-paced regularization term is embedded to ensure that the selected positive and negative samples have proper proportions. Notably, the sub-problem with respect to all weight variables may be non-convex in our formulation, whereas it is normally convex in existing self-paced problems. To address this, we propose a doubly cyclic block coordinate descent method. More importantly, we prove that the sub-problem with respect to all weight variables converges to a stationary point on the basis of closed-form solutions, and that BSPAUC converges to a stationary point of our fixed optimization objective under a mild assumption. Covering both deep learning and kernel-based implementations, experimental results on several large-scale datasets demonstrate that BSPAUC achieves better generalization performance than existing state-of-the-art AUC maximization methods.
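The tension the abstract describes, pointwise self-paced weighting versus pairwise AUC losses, can be illustrated with a minimal sketch. This is not the BSPAUC algorithm itself: the squared-hinge surrogate, the threshold `lam`, and the per-sample aggregation below are all illustrative assumptions.

```python
import numpy as np

def pairwise_auc_losses(scores_pos, scores_neg, margin=1.0):
    # AUC surrogate: squared hinge over all (positive, negative) pairs.
    # The loss is inherently pairwise, unlike standard self-paced setups.
    diffs = scores_pos[:, None] - scores_neg[None, :]
    return np.maximum(0.0, margin - diffs) ** 2

def self_paced_weights(per_sample_loss, lam):
    # Classic hard-threshold closed form: keep "easy" samples whose
    # loss falls below the pace parameter lam (assumed form).
    return (per_sample_loss < lam).astype(float)

# Toy example: three positive scores, two negative scores.
pos = np.array([2.0, 0.1, 1.5])
neg = np.array([0.0, 1.8])
L = pairwise_auc_losses(pos, neg)
# Aggregate each positive sample's pairwise losses, then select easy ones.
v_pos = self_paced_weights(L.mean(axis=1), lam=1.0)
```

Because each sample's weight depends on losses over many pairs (and, in the paper's formulation, on a balance term coupling positives and negatives), the weight sub-problem no longer decomposes pointwise, which is what motivates the specialized optimization above.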
Deep Metric Learning (DML) is a family of techniques that measure the similarity between objects with a neural network. Although the number of DML methods has rapidly increased in recent years, most previous studies cannot effectively handle noisy data, which commonly exist in practical applications and often lead to serious performance deterioration. To overcome this limitation, in this paper we build a connection between noisy samples and hard samples within the framework of self-paced learning, and propose a Balanced Self-Paced Metric Learning (BSPML) algorithm with a denoising multi-similarity formulation, where noisy samples are treated as extremely hard samples and adaptively excluded from model training by sample weighting. In particular, owing to the pairwise relationships and a new balance regularization term, the sub-problem with respect to the sample weights is a nonconvex quadratic function. To solve this nonconvex quadratic problem efficiently, we propose a doubly stochastic projection coordinate gradient algorithm. Importantly, we theoretically prove convergence not only for the doubly stochastic projection coordinate gradient algorithm but also for the overall BSPML algorithm. Experimental results on several standard data sets demonstrate that BSPML has better generalization ability and robustness than state-of-the-art robust DML approaches.
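To give a feel for the kind of weight sub-problem the abstract describes, here is a generic projected coordinate gradient sketch on box-constrained sample weights. The objective form, the balance term, the step size, and all function names are hypothetical assumptions for illustration; they are not the paper's actual BSPML formulation.

```python
import numpy as np

def weight_objective_grad(v, losses, is_pos, lam, gamma):
    # Hypothetical objective: sum_i v_i * l_i - lam * sum_i v_i
    #                         + gamma * (p_hat - n_hat)^2,
    # where p_hat / n_hat are mean weights of positive / negative samples.
    # The quadratic balance term couples the weights across samples.
    n_p, n_n = is_pos.sum(), (~is_pos).sum()
    p_hat = v[is_pos].sum() / n_p
    n_hat = v[~is_pos].sum() / n_n
    g = losses - lam
    g[is_pos] += 2 * gamma * (p_hat - n_hat) / n_p
    g[~is_pos] -= 2 * gamma * (p_hat - n_hat) / n_n
    return g

def projected_coordinate_gradient(losses, is_pos, lam=0.5, gamma=1.0,
                                  step=0.5, iters=200, rng=None):
    rng = rng or np.random.default_rng(0)
    v = np.full(losses.shape, 0.5)          # start from uniform weights
    for _ in range(iters):
        i = rng.integers(len(v))            # pick a random coordinate
        g = weight_objective_grad(v, losses, is_pos, lam, gamma)[i]
        v[i] = np.clip(v[i] - step * g, 0.0, 1.0)  # project onto [0, 1]
    return v
```

With the balance term switched off (`gamma=0`), the update reduces to the familiar self-paced thresholding: low-loss samples are driven to weight 1 and high-loss (potentially noisy) samples to weight 0.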