Sox6 has been proposed to play a conserved role in vertebrate skeletal muscle fibre type specification. In zebrafish, sox6 transcription is repressed in slow-twitch progenitors by the Prdm1a transcription factor. Here we identify sox6 cis-regulatory sequences that drive fast-twitch-specific expression in a Prdm1a-dependent manner. We show that sox6 transcription subsequently becomes derepressed in slow-twitch fibres, whereas Sox6 protein remains restricted to fast-twitch fibres. We find that translational repression of sox6 is mediated by miR-499, the slow-twitch-specific expression of which is in turn controlled by Prdm1a, forming a regulatory loop that initiates and maintains the slow-twitch muscle lineage.
This paper introduces mass estimation, a base modelling mechanism that can be employed to solve various tasks in machine learning. We present the theoretical basis of mass and efficient methods to estimate it. We show that mass estimation solves problems effectively in tasks such as information retrieval, regression and anomaly detection. The mass-based models in these three tasks perform at least as well as, and often better than, eight state-of-the-art methods in terms of task-specific performance measures. In addition, mass estimation has constant time and space complexities.
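The abstract does not spell out how mass is computed; as a rough intuition, the one-dimensional (level-1) notion of mass can be estimated by drawing random split points and counting how many points fall on the same side of the split as the query. The function below is a minimal sketch of that idea, not the paper's actual estimator; the name `mass_1d` and all parameters are illustrative assumptions.

```python
import random

def mass_1d(data, x, n_splits=1000, seed=0):
    """Illustrative one-dimensional mass estimate, averaged over random splits.

    For each random split point s drawn between the data minimum and maximum,
    the mass of x is taken to be the number of points lying on the same side
    of s as x. Central points accumulate more mass than fringe points, so mass
    ranks points by centrality without computing any pairwise distances.
    (Sketch only -- not the estimator from the paper.)
    """
    rng = random.Random(seed)
    lo, hi = min(data), max(data)
    total = 0
    for _ in range(n_splits):
        s = rng.uniform(lo, hi)
        if x < s:
            total += sum(1 for v in data if v < s)   # points on x's side (left)
        else:
            total += sum(1 for v in data if v >= s)  # points on x's side (right)
    return total / n_splits
```

Because each split requires only a side count, a tree-based version of this scheme can be built from a small fixed-size sample, which is where the constant time and space complexities claimed above come from.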
This paper introduces a new ensemble approach, Feature-Subspace Aggregating (Feating), which builds local models instead of global models. Feating is a generic ensemble approach that can enhance the predictive performance of both stable and unstable learners, whereas most existing ensemble approaches can improve the predictive performance of unstable learners only. Our analysis shows that the increased level of localisation in Feating reduces the execution time needed to generate each model in an ensemble. Our empirical evaluation shows that Feating performs significantly better than Boosting, Random Subspace and Bagging in terms of predictive accuracy when the stable learner SVM is used as the base learner. The speed-up achieved by Feating makes SVM ensembles feasible for large data sets on which they would otherwise be infeasible. When SVM is the preferred base learner, we show that Feating SVM performs better than Boosting decision trees and Random Forests. We further demonstrate that Feating also substantially reduces the error of another stable learner, k-nearest neighbour, and of an unstable learner, decision tree.
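To make the "local models" idea concrete: each Feating ensemble member picks a small feature subset, partitions the training data by the values those features take, and fits a base learner inside each cell; predictions are aggregated across members. The sketch below illustrates this structure under simplifying assumptions (discrete feature values, and a trivial majority-class predictor in place of the SVM or k-NN base learners used in the paper); the function names are hypothetical.

```python
import random
from collections import Counter, defaultdict

def feating_fit(X, y, n_members=10, subset_size=1, seed=0):
    """Sketch of Feature-Subspace Aggregating (Feating).

    Each member samples a feature subset, partitions the training instances
    by the values of those features (assumed discrete here), and trains one
    local model per cell. The local model below is a majority-class
    predictor; the paper uses stronger base learners such as SVM or k-NN.
    """
    rng = random.Random(seed)
    n_features = len(X[0])
    members = []
    for _ in range(n_members):
        subset = tuple(rng.sample(range(n_features), subset_size))
        cells = defaultdict(list)
        for xi, yi in zip(X, y):
            cells[tuple(xi[f] for f in subset)].append(yi)
        local = {k: Counter(v).most_common(1)[0][0] for k, v in cells.items()}
        fallback = Counter(y).most_common(1)[0][0]  # global majority class
        members.append((subset, local, fallback))
    return members

def feating_predict(members, x):
    """Route x to each member's matching local cell and aggregate the votes."""
    votes = Counter()
    for subset, local, fallback in members:
        key = tuple(x[f] for f in subset)
        votes[local.get(key, fallback)] += 1
    return votes.most_common(1)[0][0]
```

Because each local model is trained on only a fraction of the data, an expensive base learner such as SVM fits many small problems rather than one large one, which is the source of the speed-up described above.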