Proceedings of the Web Conference 2021
DOI: 10.1145/3442381.3450088
Peer Grading the Peer Reviews: A Dual-Role Approach for Lightening the Scholarly Paper Review Process

Abstract: Scientific peer review is pivotal to maintaining quality standards for academic publication. The effectiveness of the reviewing process is currently challenged by the rapid increase in paper submissions at various conferences. These venues need to recruit a large number of reviewers with different levels of expertise and background. The submitted reviews often do not meet the conformity standards of the conferences. Such a situation places an ever-bigger burden on the meta-reviewers when trying to reach a final…

Cited by 5 publications (11 citation statements)
References 40 publications
“…To evaluate the Rcon (see next section), we need to grade these reviews. Arous et al. (2021) conducted a large-scale crowdsourcing study for review grading, in which they asked participants to rate each review on each of the eight conformity criteria. However, they only selected a subset of the most informative reviews from the ICLR 2018 (30% of reviews) and 2019 (5% of reviews) data sets for grading.…”
Section: Data Set
confidence: 99%
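The statement above describes per-criterion crowd grades being collected for each review and later folded into a conformity measure (the "Rcon"). As a minimal Python sketch of what that aggregation step could look like: the 1-5 rating scale, the three hypothetical participants, and the plain-mean aggregation below are all illustrative assumptions, not the Bayesian weighting scheme actually used by Arous et al. (2021).

```python
import numpy as np

# Hypothetical per-review crowd ratings: each row is one participant's
# scores for the eight conformity criteria (assumed 1-5 scale).
ratings = np.array([
    [4, 5, 3, 4, 4, 5, 3, 4],   # participant 1
    [3, 4, 4, 4, 5, 4, 3, 5],   # participant 2
    [5, 4, 3, 5, 4, 4, 4, 4],   # participant 3
])

# Aggregate across participants, then across criteria, to get one
# conformity score per review (an illustrative stand-in for "Rcon";
# the cited work learns grader reliabilities instead of a plain mean).
per_criterion = ratings.mean(axis=0)      # mean rating per criterion
conformity_score = per_criterion.mean()   # overall review conformity

print(per_criterion.round(2))
print(f"Rcon (illustrative): {conformity_score:.2f}")
```

This only illustrates the shape of the graded data; the cited study's actual model weights graders by inferred reliability rather than averaging them uniformly.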
“…David Tran et al. quantified reproducibility in the review process by Monte-Carlo simulations [2]. Ines Arous et al. proposed a Bayesian framework that integrates a machine learning model with peer grading to assess the conformity of scholarly reviews [5]. To evaluate the generated review, Yuan et al. proposed a variety of diagnostic criteria for review quality, including review aspect coverage and informativeness [4].…”
Section: B. Review Assessment
confidence: 99%
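To make the Monte-Carlo idea cited above concrete, here is a hedged sketch in the spirit of the reproducibility simulations attributed to Tran et al. [2]: re-run a noisy accept/reject process twice on the same papers and measure how much the two accepted sets agree. The paper count, noise level, acceptance rate, and Gaussian score model are all illustrative assumptions, not the setup of [2].

```python
import random

random.seed(0)

N_PAPERS, ACCEPT_RATE, NOISE = 1000, 0.25, 1.0
# Latent paper quality (assumed standard normal, purely illustrative).
quality = [random.gauss(0, 1) for _ in range(N_PAPERS)]

def run_review(quality, noise, accept_rate):
    """One simulated review round: noisy scores, top-k accepted."""
    scores = [(q + random.gauss(0, noise), i) for i, q in enumerate(quality)]
    k = int(accept_rate * len(quality))
    return {i for _, i in sorted(scores, reverse=True)[:k]}

overlaps = []
for _ in range(200):  # Monte-Carlo trials: two independent review rounds
    a = run_review(quality, NOISE, ACCEPT_RATE)
    b = run_review(quality, NOISE, ACCEPT_RATE)
    overlaps.append(len(a & b) / len(a))  # fraction of accepts that agree

print(f"mean accept-set overlap: {sum(overlaps) / len(overlaps):.2f}")
```

An overlap well below 1.0 is the kind of quantitative (ir)reproducibility signal such simulations are used to report.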
“…Apart from that, peer review has also been challenged by the rapid increase in paper submissions. Consider the example of computer science conferences: the Conference on Neural Information Processing Systems (NeurIPS) received 9467 submissions in 2020, five times the number it received in 2010 [5]. The explosion in paper submissions leads to paper vetting by less-experienced researchers from disparate fields, owing to the shortage of qualified reviewers.…”
Section: Introduction
confidence: 99%
“…There are different patterns of human-machine collaboration for solving real-world problems (Kamar, Hacker, and Horvitz 2012; Russakovsky, Li, and Fei-Fei 2015; Chandra et al. 2020; Arous et al. 2021) while reducing human worker efforts (Yi et al. 2012). A typical approach is to use AI programs to identify data items that require attention from human workers (Nguyen, Wallace, and Lease 2015; Yang et al. 2019; Wilder, Horvitz, and Kamar 2020; Liu et al. 2020).…”
Section: Related Work
confidence: 99%
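The "AI flags items for human attention" pattern described above can be sketched in a few lines: a model labels every item and routes only its low-confidence predictions to human workers. The confidence threshold and the toy stand-in model below are illustrative assumptions, not any cited system's interface.

```python
THRESHOLD = 0.8  # assumed cutoff: below this model confidence, ask a human

def triage(items, model):
    """Split items into auto-labeled and human-review queues."""
    auto, needs_human = [], []
    for item in items:
        label, confidence = model(item)
        (auto if confidence >= THRESHOLD else needs_human).append(
            (item, label, confidence)
        )
    return auto, needs_human

# Toy stand-in model: pretend confidence drops as the input gets longer.
toy_model = lambda text: ("positive", 1.0 / (1 + 0.1 * len(text.split())))

auto, queue = triage(
    ["good paper", "a long and genuinely ambiguous review text"], toy_model
)
print(f"auto-labeled: {len(auto)}, sent to humans: {len(queue)}")
```

The design point is the split itself: machine effort handles the confident bulk, while scarce human effort is reserved for the ambiguous remainder.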