Replicability is an important feature of scientific research, but aspects of contemporary research culture, such as an emphasis on novelty, can make replicability seem less important than it should be. The Reproducibility Project: Cancer Biology was set up to provide evidence about the replicability of preclinical research in cancer biology by repeating selected experiments from high-impact papers. A total of 50 experiments from 23 papers were repeated, generating data about the replicability of a total of 158 effects. Most of the original effects were positive effects (136), with the rest being null effects (22). A majority of the original effect sizes were reported as numerical values (117), with the rest being reported as representative images (41). We employed seven methods to assess replicability, and some of these methods were not suitable for all the effects in our sample. One method compared effect sizes: for positive effects, the median effect size in the replications was 85% smaller than the median effect size in the original experiments, and 92% of replication effect sizes were smaller than the original. The other methods were binary – the replication was either a success or a failure – and five of these methods could be used to assess both positive and null effects when effect sizes were reported as numerical values. For positive effects, 40% of replications (39/97) succeeded according to three or more of these five methods, and for null effects 80% of replications (12/15) were successful on this basis; combining positive and null effects, the success rate was 46% (51/112). A successful replication does not definitively confirm an original finding or its theoretical interpretation. Equally, a failure to replicate does not disconfirm a finding, but it does suggest that additional investigation is needed to establish its reliability.
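The success criterion described above (a replication counts as successful when at least three of the five binary assessment methods agree) and the combined success rate can be sketched in a few lines of Python. This is an illustrative sketch only; the function name and the example outcomes are hypothetical, and only the counts (39/97 positive, 12/15 null) come from the abstract.

```python
def replication_success(method_outcomes, threshold=3):
    """Return True if at least `threshold` of the binary methods scored success."""
    return sum(method_outcomes) >= threshold

# Illustrative outcomes for five binary methods (True = success)
example_positive = [True, True, False, True, False]   # 3 of 5 methods agree
example_null = [True, False, False, True, False]      # only 2 of 5 agree

print(replication_success(example_positive))  # True
print(replication_success(example_null))      # False

# Combined success rate from the abstract: (39 + 12) successes out of (97 + 15)
combined_rate = (39 + 12) / (97 + 15)
print(f"{combined_rate:.0%}")  # 46%
```

The threshold is a majority rule over the five methods; varying `threshold` shows how sensitive the headline success rate is to the chosen cutoff.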
Data Availability Statement: All data and materials are publicly accessible at https://osf.io/rfgdw/. The preregistered study design and analysis plan are available at https://osf.io/ipkea/.
Funding: The authors received no specific funding for this work.
Competing Interests: Brian Nosek created the badges to acknowledge open practices, and Brian Nosek and Mallory Kidwell are on a committee maintaining the badges. The badges and specifications for earning them are CC0 licensed.
We compared reported data and materials sharing in Psychological Science with that in four comparison journals. We report an increase in reported data sharing of more than an order of magnitude from baseline in Psychological Science, as well as a weaker increase in reported materials sharing. Moreover, we show that reportedly available data and materials were more accessible, correct, usable, and complete when badges were earned. We demonstrate that badges are effective incentives that improve the openness, accessibility, and persistence of data and materials that underlie scientific research.
We conducted the Reproducibility Project: Cancer Biology to investigate the replicability of preclinical research in cancer biology. The initial aim of the project was to repeat 193 experiments from 53 high-impact papers, using an approach in which the experimental protocols and plans for data analysis had to be peer reviewed and accepted for publication before experimental work could begin. However, the various barriers and challenges we encountered while designing and conducting the experiments meant that we were only able to repeat 50 experiments from 23 papers. Here we report these barriers and challenges. First, many original papers failed to report key descriptive and inferential statistics: the data needed to compute effect sizes and conduct power analyses were publicly accessible for just 4 of 193 experiments. Moreover, despite contacting the authors of the original papers, we were unable to obtain these data for 68% of the experiments. Second, none of the 193 experiments were described in sufficient detail in the original paper to enable us to design protocols to repeat the experiments, so we had to seek clarifications from the original authors. While authors were extremely or very helpful for 41% of experiments, they were minimally helpful for 9% of experiments, and not at all helpful (or did not respond to us) for 32% of experiments. Third, once experimental work started, 67% of the peer-reviewed protocols required modifications to complete the research and just 41% of those modifications could be implemented. Cumulatively, these three factors limited the number of experiments that could be repeated. This experience draws attention to a fundamental concern about replication: it is hard to assess whether reported findings are credible.
p90 ribosomal S6 kinase (RSK) is an important downstream effector of mitogen-activated protein kinase, but its biological functions are not well understood. We have now identified the first small-molecule, RSK-specific inhibitor, which we isolated from the tropical plant Forsteronia refracta. We have named this novel inhibitor SL0101. SL0101 shows remarkable specificity for RSK. The major determinant of SL0101-binding specificity is the unique ATP-interacting sequence in the amino-terminal kinase domain of RSK. SL0101 inhibits proliferation of the human breast cancer cell line MCF-7, producing a cell cycle block in G1 phase with an efficacy paralleling its ability to inhibit RSK in intact cells. RNA interference of RSK expression confirmed that RSK regulates MCF-7 proliferation. Interestingly, SL0101 does not alter proliferation of a normal human breast cell line MCF-10A, although SL0101 inhibits RSK in these cells. We show that RSK is overexpressed in ∼50% of human breast cancer tissue samples, suggesting that regulation of RSK has been compromised. Thus, we show that RSK has an unexpected role in proliferation of transformed cells and may be a useful new target for chemotherapeutic agents. SL0101 will provide a powerful new tool to dissect the molecular functions of RSK in cancer cells.
The two of us have spent years coordinating replications of published studies. The most consistent outcomes are confusion and disagreement, particularly when results seem to contradict the original findings. We saw this in the Reproducibility Project: Cancer Biology, in which we managed attempts to replicate experiments from high-impact papers [1]. Among the 50 replication experiments completed (from 23 papers), one required transplanting leukaemia cells into immunocompromised mice and letting the cells grow before administering a potential treatment.