Evaluating performance in scientific research, whether at the individual, institutional, research-council or national level, is not easy. Traditionally, research evaluation at the individual and institutional levels has depended largely on peer opinion, but with the rapid growth of science over the last century and the availability of databases and scientometric techniques, quantitative indicators have gained importance. Both peer review and metrics have flaws, more so in India because of the way they are used. Government agencies, funding bodies, and academic and research institutions in India suffer from an impact-factor and h-index syndrome. The uninformed use of indicators such as average and cumulative impact factors, and the arbitrary criteria stipulated by agencies such as the University Grants Commission, the Indian Council of Medical Research and the Medical Council of India for the selection and promotion of faculty, have made it difficult to distinguish good science from the bad and the indifferent. The exaggerated importance these agencies give to the sheer number of publications, irrespective of what those publications report, has led to an ethical crisis in scholarly communication and the reward system of science. These agencies seem unconcerned about the proliferation of predatory journals and conferences. After giving examples of the bizarre use of indicators and of arbitrary recruitment and evaluation practices in India, we summarise the merits of peer review and quantitative indicators and the evaluation practices followed elsewhere.
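The h-index criticised above has a simple definition: the largest h such that an author has h papers each cited at least h times. The minimal sketch below (with hypothetical citation counts, not data from the article) illustrates why reducing a career to this single number can blur real differences between research records:

```python
def h_index(citations: list[int]) -> int:
    """Return the largest h such that h papers have at least h citations each."""
    h = 0
    for rank, cites in enumerate(sorted(citations, reverse=True), start=1):
        if cites >= rank:
            h = rank  # the paper at this rank still clears the threshold
        else:
            break
    return h

# Two very different publication records, same h-index:
print(h_index([50, 40, 30, 5, 4, 3]))  # 4 -- a few highly cited papers
print(h_index([4, 4, 4, 4, 1, 0]))     # 4 -- uniformly modest citations
```

Both hypothetical records score h = 4, even though the first includes several highly cited papers and the second none, which is one reason uninformed reliance on such indicators can fail to separate good science from the indifferent.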
ORCID, the Open Researcher and Contributor ID, is a non-profit, community-driven effort to create and maintain a registry of unique researcher identifiers and a transparent method of linking research activities and outputs to those identifiers. Together with other persistent identifiers for scholarly works, such as digital object identifiers (DOIs), and identifiers for organizations, ORCID makes research more discoverable. It helps ensure that one's grants, publications and other outputs are correctly attributed. It serves the research community not just in aggregating publications but at every stage of research, viz. publishing, reviewing, profiling, metrics, access and archiving. Funding agencies in Austria, Australia, Denmark, Portugal, Sweden and the UK, as well as the world's leading scholarly publishers and associations, have integrated their systems with the ORCID registry. Among the BRICS countries, China and South Africa are adopting ORCID avidly; India is yet to make a beginning. If research councils and funding agencies in India required researchers to obtain ORCID iDs and linked those iDs to funding and performance tracking, it would help them follow the entire research workflow. Journal editors, too, can track the contributions of different authors and the work assigned to reviewers through their ORCID iDs.
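To make the "linking outputs to identifiers" idea concrete, here is a minimal sketch that queries ORCID's public v3.0 REST API for the works attached to an iD. It is an illustration under stated assumptions, not part of the article: it assumes the public endpoint at pub.orcid.org and uses 0000-0002-1825-0097, ORCID's own documented example record; error handling is kept minimal.

```python
import requests

def fetch_work_titles(orcid_id: str) -> list[str]:
    """Fetch titles of works linked to an ORCID iD via the public v3.0 API."""
    url = f"https://pub.orcid.org/v3.0/{orcid_id}/works"
    resp = requests.get(url, headers={"Accept": "application/json"}, timeout=30)
    resp.raise_for_status()
    titles = []
    # Works are grouped (e.g. the same paper reported by several sources);
    # each group holds one or more work summaries.
    for group in resp.json().get("group", []):
        for summary in group.get("work-summary", []):
            title = summary.get("title", {}).get("title", {}).get("value")
            if title:
                titles.append(title)
    return titles

# ORCID's documented example iD (Josiah Carberry, a fictitious test record):
for title in fetch_work_titles("0000-0002-1825-0097"):
    print(title)
```

A funder or journal system integrated with the registry can use the same kind of lookup, with authenticated API access, to attribute grants, publications and reviews to the right person automatically.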