Plagiarism may take place in any scientific journal despite currently employed anti-plagiarism tools. The absence of widely accepted definitions of research misconduct and reliance solely on similarity checks do not allow journal editors to prevent the most complex cases of recycling of scientific information and wasteful, or ‘predatory,’ publishing. This article analyses Scopus-based publication activity and evidence on poor writing, lack of related training, emerging anti-plagiarism strategies, and new forms of massive wasting of resources through the publishing of largely recycled items that evade the ‘red flags’ of similarity checks. In some non-Anglophone countries, ‘copy-and-paste’ writing still plagues pre- and postgraduate education. Poor research management, the absence of courses on publication ethics, and limited access to quality sources compound plagiarism as a cross-cultural and multidisciplinary phenomenon. Over the past decade, the advent of anti-plagiarism software checks has helped uncover elementary forms of textual recycling across journals. But such a tool alone is insufficient for preventing complex forms of plagiarism. Recent mass retractions of plagiarized articles by reputable open-access journals point to critical deficiencies of current anti-plagiarism software, which does not recognize manipulative paraphrasing and editing. Manipulative editing also finds its way into predatory journals, which ignore publication ethics and accommodate nonsensical plagiarized items. Evolving preventive strategies increasingly rely on intelligent (semantic) digital technologies that comprehensively evaluate texts, keywords, graphics, and reference lists. It is the right time to enforce adherence to global editorial guidance and to implement a comprehensive anti-plagiarism strategy that supports all stakeholders of scholarly communication.
Numerous quantitative indicators are currently available for evaluating research productivity. No single metric is suitable for a comprehensive evaluation of author-level impact, and the choice of particular metrics depends on the purpose and context of the evaluation. The aim of this article is to overview some of the widely employed author impact metrics and highlight perspectives for their optimal use. The h-index is one of the most popular metrics for research evaluation: it is easy to calculate and understandable for non-experts, and it is automatically displayed on researcher and author profiles in citation databases such as Scopus and Web of Science. Its main advantage relates to its combined quantification of publication and citation counts. This index is increasingly cited globally. While an appropriate indicator of the publication and citation activity of highly productive and successfully promoted authors, the h-index has been criticized primarily for disadvantaging early-career researchers and authors with few indexed publications. Numerous variants of the index have been proposed to overcome its limitations, and alternative metrics have emerged to highlight ‘societal impact.’ However, each of these traditional and alternative metrics has its own drawbacks, necessitating careful analysis of the context of social attention and the value of publication and citation sets. Perspectives for the optimal use of researcher and author metrics depend on evaluation purposes and are compounded by information sourced from various global, national, and specialist bibliographic databases.
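The abstract above notes that the h-index is easy to calculate from publication and citation counts. As a minimal illustration of the standard definition (the largest h such that an author has at least h papers each cited at least h times), a sketch in Python might look like this; the function name and example citation counts are illustrative, not drawn from the article:

```python
def h_index(citations):
    """Return the h-index for a list of per-paper citation counts:
    the largest h such that at least h papers have >= h citations each."""
    # Rank papers from most- to least-cited.
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(ranked, start=1):
        # The paper at position `rank` must itself have >= rank citations.
        if count >= rank:
            h = rank
        else:
            break
    return h

# Five papers cited [10, 8, 5, 4, 3] times yield h = 4:
# four papers have at least 4 citations, but not five with at least 5.
print(h_index([10, 8, 5, 4, 3]))  # -> 4
```

This simple ranking-based computation also makes the criticism in the abstract concrete: an early-career author with few indexed publications is capped at a low h regardless of how heavily those few papers are cited.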
Citations to scholarly items are the building blocks of multidisciplinary science communication. Citation analyses currently influence individual career advancement and the ranking of academic and research institutions worldwide. This article overviews the involvement of scientific authors, reviewers, editors, publishers, indexers, and learned associations in citing and referencing to preserve the integrity of science communication. Authors are responsible for thorough bibliographic searches to select relevant references for their articles, comprehend their main points, and cite them ethically. Reviewers and editors may perform additional searches and recommend missing essential references. Publishers, in turn, are in a position to instruct their authors on citations and references, provide tools for the validation of references, and open access to bibliographies. Publicly available reference lists carry important information about the novelty of scholarly items and their relatedness to the published literature. Few editorial associations have dealt with the issue of citations and properly managed references. As a prime example, the International Committee of Medical Journal Editors (ICMJE) issued in December 2014 an updated set of recommendations on the need to cite primary literature and avoid unethical references, which are applicable to the global scientific community. With the exponential growth of the literature and related references, it is critically important to define the functions of all stakeholders of science communication in curbing irrational and unethical citations, thereby improving the quality and indexability of scholarly journals.
Authorship problems are deep-rooted in the field of science communication. Some of these relate to the lack of specific journal instructions. For decades, experts in journal editing and publishing have explored the authorship criteria and the contributions deserving either co-authorship or acknowledgment. The issue of inconsistencies in listing and abbreviating author names has come to the fore lately. There are reports on the difficulties of correctly rendering Chinese surnames and the given names of South Indian authors in scholarly articles. However, it seems that problems with correctly listing and abbreviating author names are global. This article presents an example of swapping the second (father’s) name with the surname in a ‘predatory’ journal, where numerous instances of incorrectly identifying and crediting authors passed unnoticed by the journal editors, and no correction has been published. Possible solutions are discussed in relation to identifying author profiles and adjusting editorial policies to the emerging problems. Correcting mistakes with author names post-publication and integrating with the Open Researcher and Contributor ID (ORCID) platform are among them.
Specialist bibliographic databases offer essential online tools for researchers and authors who work on specific subjects and perform comprehensive and systematic syntheses of evidence. This article presents examples of established specialist databases that may be of interest to those engaged in multidisciplinary science communication. Access to most specialist databases is through subscription schemes and membership in professional associations. Several information aggregators and database vendors, such as EBSCOhost and ProQuest, facilitate advanced searches supported by specialist keyword thesauri. Searches through specialist databases complement those through multidisciplinary research platforms, such as PubMed, Web of Science, and Google Scholar. Familiarity with the functional characteristics of biomedical and non-biomedical bibliographic search tools is essential for researchers, authors, editors, and publishers. Database users are offered updates of the indexed journal lists, abstracts, author profiles, and links to other metadata. Editors and publishers may find the source selection criteria particularly useful and may apply for coverage of their peer-reviewed journals and grey-literature sources. These criteria are aimed at accepting relevant sources with established editorial policies and quality controls.