We present an updated analysis of systematics in the Atacama Large Millimeter/submillimeter Array (ALMA) proposal ranks that includes the last two ALMA cycles, when significant changes were introduced in the proposal review process. In Cycle 7, the investigator list on the proposal cover sheet was randomized such that the reviewers were aware of the overall proposal team but did not know the identity of the principal investigator (PI). In Cycle 8, ALMA adopted distributed peer review for most proposals and implemented dual-anonymous review for all proposals, in which the identity of the proposal team was not revealed to the reviewers. The most significant change in the systematics in Cycles 7 and 8 compared to previous cycles is related to the experience of PIs in submitting ALMA proposals. PIs who submit a proposal every cycle tend to have ranks consistent with the average in Cycles 7 and 8, whereas in previous cycles they had the best overall ranks. Also, PIs who submitted a proposal for the second time show improved ranks over previous cycles. These results suggest that biases related to the relative prominence of the PI have been present in the ALMA review process. Systematics related to regional affiliation remain largely unchanged, in that PIs from Chile, East Asia, and non-ALMA regions tend to have poorer overall ranks than PIs from Europe and North America. The systematics of how one region ranks proposals from another region are also investigated. No significant differences in the overall ranks based on the gender of the PI are observed.
In response to the challenges presented by high reviewer workloads in traditional panel reviews and the increasing number of submitted proposals, ALMA implemented distributed peer review to assess the majority of proposals submitted to the Cycle 8 Main Call. In this paper, we present an analysis of this review process. Over 1000 reviewers participated in the process to review 1497 proposals, making it the largest implementation of distributed peer review to date in astronomy and marking the first time this process has been used to award the majority of observing time at an observatory. We describe the process of assigning proposals to reviewers, analyze the nearly 15,000 ranks and comments submitted by reviewers to identify any trends and systematics, and gather feedback on the process from reviewers and Principal Investigators (PIs) through surveys. Approximately 90% of the proposal assignments were aligned with the expertise of the reviewer, as measured both by the expertise keywords provided by the reviewers and by the reviewers' self-assessment of their expertise on their assigned proposals. PIs rated 73% of the individual review comments as helpful, and even though the reviewers had a broad range of experience levels, PIs rated the quality of the comments received from students and senior researchers similarly. The primary concerns raised by PIs were the quality of some reviewer comments and the high dispersion in the ranks. The ranks and comments are correlated with various demographics to identify the main areas in which the review process can be improved in future cycles.