2017
DOI: 10.1057/palcomms.2017.40
The future of societal impact assessment using peer review: pre-evaluation training, consensus building and inter-reviewer reliability

Abstract: There are strong political reasons underpinning the desire to achieve a high level of inter-reviewer reliability (IRR) within peer review panels. Achieving a high level of IRR is synonymous with an efficient review system, and the wider perception of a fair evaluation process. Therefore, there is an arguable role for a more structured approach to the peer review process during a time when evaluators are effectively novices in practice with the criterion, such as with societal impact. This article explores the …



Cited by 22 publications (12 citation statements). References 34 publications.
“…The use of calibration exercises prior to the formal evaluation taking place within panels (Derrick and Samuel 2017), was used successfully during REF2014 as a mechanism to assist robust discussion around Impact, as well as provide an opportunity for panel members to clarify expectations and form a common lens to guide the impact evaluation (Derrick 2018). Calibration exercises, especially when the evaluation is anticipated as more complex, as was the case of Impact in REF2014, are used as an exercise in maintaining consistency and fairness in evaluation.…”
Section: Additional Calibration To Accommodate Mitigation and Virtual Decision-making
confidence: 99%
“…Moreover, in practice, case studies often utilize academic bibliometrics instead of social metrics, making it more difficult to understand the verification logic of impact. The pursuit of normalizing impact criteria and evaluation mechanics also encounter academic peer-review, which is often used as expertise in case studies and other forms of panel assessment (Derrick & Samuel 2017;Derrick 2018, 11).…”
Section: Strategies To Verify Impacts
confidence: 99%
“…It seems that interdisciplinarity as a form of academic self-control is easier to comprehend than external social control indicating lack of assessment culture for such conceptualization (cf. Derrick & Samuel 2017). Ostensibly, academic differentiation seems not to fit in the frame of impact expertise emphasizing multidisciplinary aims of knowledge, but, yet, disciplinary knowledge and understanding had their place in the informants' views.…”
Section: Meta-expertise Across Discipline Boundaries
confidence: 99%
“…To have a foundation guide for evaluators, the alignment of what should be considered impact is necessary to reach a consensus in the evaluation, regardless of their diverse opinions and ideas. Derrick and Samuel (2017) suggest, for example, having pre-evaluation training.…”
confidence: 99%