Expertise in searching and evaluating scientific literature is a requisite skill of trained scientists and science students, yet information literacy instruction varies greatly among institutions and programs. To ensure that science students acquire information literacy skills, robust methods of assessment are needed. Here, we describe a novel tool for longitudinal, crossover assessment of literature-searching skills in science students and apply it to a cross-sectional assessment of literature-searching performance in 145 first-year and 43 senior biology majors. Subjects were given an open-ended prompt requiring them to find multiple sources of information addressing a particular scientific topic. A blinded scorer used a rubric to score the resources identified by the subjects and generate numerical scores for source quality, source relevance, and citation quality. Two versions of the assessment prompt were given to facilitate eventual longitudinal study of individual students in a crossover design. Seniors were significantly more likely to find relevant, peer-reviewed journal articles, provide appropriate citations, and provide correct answers to other questions about scientific literature. This assessment tool accommodates large numbers of students and can be modified easily for use in other disciplines or at other levels of education.

doi:10.5860/crl.77.6.682
A Novel Assessment Tool for Quantitative Evaluation

Introduction

Trained scientists in diverse disciplines share certain fundamental skills, and educators in science, technology, engineering, and mathematics (STEM) aim to train students in these cross-disciplinary skills in addition to the skills and knowledge of specific disciplines. Several of these fundamental skills are enumerated within the concept of science information literacy, which has been defined "as a set of abilities to identify the need for information, procure the information, evaluate the information and subsequently revise the strategy for obtaining the information, to use the information and to use it in an ethical and legal manner, and to engage in lifelong learning."1 Trained scientists must apply these skills routinely, whereas high school students typically evaluate sources and information on a more limited basis.2 To develop these skills in undergraduate students, STEM curricula must incorporate information literacy.3 However, information literacy instruction varies greatly among institutions and programs.4 Likewise, assessment of information literacy varies widely in format, information skills and knowledge tested, time frame over which students are monitored, and degree to which student inputs are standardized and measurements are comparable between studies (see table 1).