Peer review is a popular instrument for identifying good contributions in communities. A problem with peer review is that reviewers have little incentive to invest significant effort. To address this problem, the authors introduce a new variant of peer review. It differs from conventional peer review in two ways: first, peers who have made a contribution must also review the contributions of others; second, each contributor rates the reviews he has received. To incentivize reviewing, the authors design an assessment scheme that evaluates not only the quality of a peer's contribution but also the quality of the reviews he has submitted. The scheme ranks peers by overall performance, and the ranks determine their payoff. Such a setting gives rise to competition among peers. A core challenge, however, is to elicit objective reviews and ratings. The authors consider two issues that stand in the way of this objectivity: first, they expect preference bias in ratings, i.e., peers tend to favor reviews that give them high scores and to dislike reviews that give them low scores; second, strategic peers might defame others in their reviews or ratings, because they perceive them as competitors. To address these issues, the authors propose a heuristic. Further, they carry out a user study in a lecture scenario to evaluate their scheme. It shows that students are incentivized to submit high-quality reviews and that the scheme is effective for evaluating student performance.
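The summary does not reproduce the authors' actual formula, but the general idea of an assessment that combines contribution quality with review quality and then ranks peers can be sketched as follows. The weighting, the rank-to-payoff mapping, and all names below are illustrative assumptions, not the authors' scheme.

```python
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class Peer:
    name: str
    contribution_score: float  # quality of the peer's own contribution
    review_ratings: list[float] = field(default_factory=list)  # ratings received for reviews written

def overall_score(peer: Peer, w: float = 0.7) -> float:
    """Illustrative combination: weight contribution quality against review quality.
    The weight w = 0.7 is an assumption, not a parameter from the paper."""
    review_quality = mean(peer.review_ratings) if peer.review_ratings else 0.0
    return w * peer.contribution_score + (1 - w) * review_quality

def rank_and_pay(peers: list[Peer], payoffs: list[float]) -> dict[str, float]:
    """Rank peers by overall score (best first) and map ranks to payoffs.
    `payoffs` lists the payoff per rank, e.g. [10, 5, 2]; later ranks get the last value."""
    ranked = sorted(peers, key=overall_score, reverse=True)
    return {p.name: payoffs[min(i, len(payoffs) - 1)] for i, p in enumerate(ranked)}

# Example: three peers with contribution scores and the ratings their reviews received.
peers = [
    Peer("alice", 0.9, [0.8, 0.7]),
    Peer("bob", 0.6, [0.9, 0.95]),
    Peer("carol", 0.8, []),
]
print(rank_and_pay(peers, payoffs=[10.0, 5.0, 2.0]))
```

Because payoff depends on rank rather than on absolute scores, peers compete directly, which is exactly why the objectivity of reviews and ratings becomes the critical concern.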