This paper reports on a research project that examined the test scores of students who took part in an online information literacy course. Researchers analyzed the pre- and post-test scores of students who received different types of instruction, including a traditional one-shot library session and an online course. Results show that students who participated in the online course demonstrated significant improvement in their test scores compared to the other students. This study shows freshman students' need for more comprehensive information literacy instruction. It also shows that information literacy instruction can be effective when delivered online.
Background - The University of Arizona Libraries (UAL) has a decades-long history of working with the English Department's Freshman Writing Program in support of the goals of the General Education Curriculum. Until recently, information literacy (IL) instruction for composition courses took the form of one-shot, fifty-minute sessions for individual course sections. However, beginning in late 2008, the libraries strategically began moving more of their instructional activities online in an effort to deliver information literacy instruction more efficiently and effectively to a greater number of students in an anytime/anywhere environment. With a yearly increase in
Purpose - With a call for increased accountability for student learning across higher education, it is becoming more important for academic libraries to show their value to the greater university community with the use of quantitative data. This paper seeks to describe the development of an information literacy test at the University of Arizona to measure student learning in an online credit course. In order to measure the impact of an online course, a statistically valid and reliable test was created by local librarians. Design/methodology/approach - The methodology involved administering test items to undergraduate students enrolled in an online information literacy course and applying both classical test theory and item response theory models to evaluate the validity and reliability of test items. This study included the longitudinal and cross-sectional development of test items for pre- and post-testing across different student groups. Over the course of two semesters, 125 items were developed and administered to over 1,400 students. Findings - The creation of test items and the process of making test items reliable and valid is discussed in detail. Items were checked for construct validity with the use of a national standardized test of information literacy (SAILS). Locally developed items were found to have a higher than average reliability rating. Practical implications - The process described here offers a method for librarians without a background in assessment to develop their own statistically valid and reliable instrument. Originality/value - One of the unique features of this research design was the correlation of SAILS items with local items to test for validity. Although SAILS items have been used by many libraries in the past, they have not been used to create new test items. The use of the original SAILS test items is a valuable resource for instruction librarians developing items locally.
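To illustrate the kind of classical-test-theory reliability analysis described above, the sketch below computes Cronbach's alpha, a standard internal-consistency statistic, for a matrix of scored test items. The score matrix and function name are invented for demonstration; the abstract does not specify which reliability coefficient or software the librarians used.

```python
# Hypothetical illustration of a classical-test-theory reliability check:
# Cronbach's alpha for a students x items matrix of dichotomous scores
# (rows = students, columns = items; 1 = correct, 0 = incorrect).
# The data and function name are invented for demonstration only.

def cronbach_alpha(scores):
    """Compute Cronbach's alpha for a students-by-items score matrix."""
    k = len(scores[0])  # number of items
    def var(xs):
        # Sample variance (n - 1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    # Variance of each item's scores across students
    item_vars = [var([row[i] for row in scores]) for i in range(k)]
    # Variance of each student's total score
    total_var = var([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Invented scores for six students on five items
scores = [
    [1, 1, 1, 0, 1],
    [1, 0, 1, 0, 1],
    [0, 0, 0, 0, 0],
    [1, 1, 1, 1, 1],
    [1, 0, 1, 0, 0],
    [0, 1, 0, 1, 1],
]
print(round(cronbach_alpha(scores), 3))  # → 0.674
```

In practice an alpha of roughly 0.7 or higher is often taken as acceptable internal consistency for a locally developed instrument, which is the kind of threshold the item-revision process described in the abstract would work toward.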
This study investigates whether the type of instruction (a single face-to-face librarian-led session, instructor-led instruction, or an online IL course - the Online Research Lab) has an impact on student information literacy gains in a Freshman English Composition program. A performance-based assessment was carried out by analyzing bibliographies in a required controversy paper. Descriptive, correlation, and regression analyses showed that the type of instruction did impact the quality of the bibliographies. Students in the online IL course had higher quality bibliographies than students who received a single face-to-face instruction session.
Purpose - There is a lack of data about information literacy (IL) credit courses in US academic libraries. This paper aims to provide a detailed snapshot of IL credit courses, including percentages of libraries that offer credit courses, the number of credits offered, the audience and how public institutions differ from private nonprofits and for-profits. Design/methodology/approach - The authors surveyed a stratified random sample of libraries at higher education institutions across all categories from the Carnegie Classification of Institutions of Higher Education. Qualtrics software was used to create and distribute the email survey. The response rate was 39 per cent (n = 691). Findings - In all, 19 per cent of the institutions in the survey have IL credit courses taught by librarians. Large institutions, public institutions and those granting doctoral degrees are the most likely to offer IL credit courses. The majority of these courses are undergraduate electives of 1-2 credit hours offered under the library aegis, although a significant minority are required, worth 3-4 credit hours, and taught within another academic department or campus-wide program. Originality/value - The findings update previous surveys and provide a more granular picture of the characteristics of librarian-taught credit-bearing courses, the types of academic institutions that offer them and the compensation teaching librarians receive. This survey is the first study of credit-bearing IL instruction to include for-profit colleges and universities.