To increase the replicability of scientific work, the scientific community has called for practices designed to increase the transparency of research (McNutt, 2014; Nosek et al., 2015). The validity of a scientific claim depends not on the reputation of those making the claim, the venue in which the claim is made, or the novelty of the result, but rather on the empirical evidence provided by the underlying data and methods. Proper evaluation of the merits of scientific findings requires availability of the methods, materials, and data, as well as the reasoned arguments, that serve as the basis for the published conclusions (Claerbout and Karrenbach, 1992; Donoho et al., 2009; Stodden et al., 2013; Borwein et al., 2013; Munafò et al., 2017). Wide and growing support for these principles (see, for example, signatories to the Declaration on Research Assessment, DORA, https://sfdora.org/, and the Transparency and Openness Promotion Guidelines, https://cos.io/our-services/top-guidelines/) must be coupled with guidelines to increase open sharing of data and research materials, use of reporting guidelines, preregistration, and replication. We propose that, going forward, authors of all scientific articles disclose the availability and location of all research items related to their published articles, including data, materials, and code, in what we will refer to as a TOP Statement.
Citation metrics and the journal impact factor have for decades been the traditional means of assessing scholarly impact. In today's digital content environment, these metrics are being challenged as no longer the best methods for research assessment, owing to issues such as currency and granularity. New alternative metrics, known as altmetrics, have sprung up, but they are not without their own issues, in particular a lack of standardization. The National Information Standards Organization (NISO) launched a project in 2013 to explore, identify, and advance standards and best practices for alternative metrics. A draft white paper summarizing the recommendations of the first phase has been issued and will be followed by a second phase developing one or more of the proposed standards and recommended practices.
This paper describes a survey of BioOne participating publishers conducted during the fall of 2003. In that survey, BioOne collected data from 18 not-for-profit publishers on circulation levels, scholarly output in terms of pages and articles produced, revenues, and expenditures. Eight of the publishers provided complete profit, loss, and circulation information, while the remaining 10 provided only circulation data and answers to general operations questions. This information was then compiled to compare the business operations of these publishers against industry-standard benchmarks, to assess their business practices, and to examine the effect of recent trends on publishers' revenue streams and costs. The paper also explores these data in relation to a shift toward a publishing model based not on print but on electronic subscriptions.
scite is a Brooklyn-based organization that helps researchers better discover and understand research articles through Smart Citations: citations that display the context in which an article is cited and describe whether it provides supporting or contrasting evidence. scite is used by students and researchers from around the world and is funded in part by the National Science Foundation and the National Institute on Drug Abuse of the National Institutes of Health.