Objective: The study evaluated whether a modified version of the information literacy Valid Assessment of Learning in Undergraduate Education (VALUE) rubric would be useful for assessing the information literacy skills of graduate health sciences students.

Methods: Through facilitated calibration workshops, an interdepartmental six-person team of librarians and faculty engaged in guided discussion about the meaning of the rubric criteria. They applied the rubric to score student work for a peer-review essay assignment in the "Information Literacy for Evidence-Based Practice" course. To determine inter-rater reliability, the raters participated in a follow-up exercise in which they independently applied the rubric to ten samples of work from a research project in the doctor of physical therapy program: the patient case report assignment.

Results: For the peer-review essay, a high level of consistency in scoring was achieved in the second workshop, with statistically significant intraclass correlation coefficients above 0.8 for three criteria: "Determine the extent of evidence needed," "Use evidence effectively to accomplish a specific purpose," and "Access the needed evidence." Participants concurred that the essay prompt and rubric criteria adequately discriminated the quality of student work for the peer-review essay assignment. When raters independently scored the patient case report assignment, inter-rater agreement was low and not statistically significant for any rubric criterion (kappa = −0.16, p > 0.05, to kappa = 0.12, p > 0.05).

Conclusions: While the peer-review essay assignment lent itself well to rubric calibration, scorers had a difficult time with the patient case report. Lack of familiarity among some raters with the specifics of the patient case report assignment and subject matter might have accounted for the low inter-rater reliability. When norming, it is important to hold conversations about search strategies and expectations of performance. Overall, the authors found the rubric to be appropriate for assessing the information literacy skills of graduate health sciences students.
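As a concrete illustration of the agreement statistics reported above, the sketch below shows how pairwise inter-rater agreement on rubric scores could be computed with scikit-learn's Cohen's kappa. This is not the authors' analysis code, and the scores and 0-3 rubric scale shown are hypothetical; the study reports kappa and intraclass correlation coefficients but does not publish its scripts.

# Minimal sketch (not the authors' code) of chance-corrected agreement
# between two raters scoring the same ten work samples on one rubric
# criterion. Scores and the 0-3 scale below are hypothetical.
from sklearn.metrics import cohen_kappa_score

rater_a = [2, 3, 1, 2, 2, 3, 0, 1, 2, 3]  # hypothetical rubric scores, rater A
rater_b = [2, 2, 1, 3, 2, 3, 1, 1, 2, 2]  # hypothetical rubric scores, rater B

# Cohen's kappa corrects raw percent agreement for agreement expected by
# chance; values near zero (as in the case report results) indicate
# agreement no better than chance.
kappa = cohen_kappa_score(rater_a, rater_b)
print(f"Cohen's kappa: {kappa:.2f}")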