2020
DOI: 10.1353/pla.2020.0006
A Multi-Method Information Literacy Assessment Program: Foundation and Early Results

Abstract: The information literacy (IL) assessment program at Manhattan College in Riverdale, New York, instituted in 2014-2015, evaluates students' information literacy capabilities as demonstrated in their written coursework, their test performance, and their comments on library instruction sessions. Both instruction and assessment are closely linked to five learning objectives, and five years' assessment results have led to significant changes in the IL instruction program. This paper presents key concepts in IL asse…

Cited by 8 publications (8 citation statements)
References 49 publications
“…Regarding the methods of evaluating information literacy, most scholars use only self-designed questionnaires that consist of self-assessments with closed-ended test questions (Pinto et al, 2019 ). Other scholars have combined interviews (Walters et al, 2020 ), experiments (Ding & Ma, 2013 ) and other survey methods to evaluate the developmental level of information literacy. However, these methods lack enthusiasm and flexibility, and the data collection process requires substantial cooperation from users, which is time-consuming and laborious.…”
Section: Discussion (mentioning)
confidence: 99%
“…With the development of the Internet, big data, artificial intelligence and other emerging technologies, massive and diverse process data can be recorded, which provides the possibility of data-driven evaluation. Various studies on technology behavior have used user process data; for example, Han et al ( 2019 ) assessed the degrees and features of teachers’ online participation in BL implementation, and Walters et al ( 2020 ) used students’ comments on a library instruction session to evaluate their information literacy. Kim et al ( 2020 ) used online search behavior to identify differences between self-perceived eHealth literacy and performance in judging the authenticity of cancer information.…”
Section: Discussion (mentioning)
confidence: 99%
“…26 Walters asserts that "the distinction between evidence-based and perception-based measures is nearly identical to the distinction between cognitive and affective measures." 27 Cognitive, or evidence-based, measures look at data in which students demonstrate their skills as opposed to affective measures that examine students' confidence (for example) using those skills. Of indirect versus direct methods of data collection--which Walters says "are closely linked to cognitive or affective constructs" 28 --Walters explains that indirect methods, such as surveys or self-evaluations, in which students report what they do, build in a level of abstraction that introduces bias.…”
Section: Literature Review (mentioning)
confidence: 99%
“…Of indirect versus direct methods of data collection--which Walters says "are closely linked to cognitive or affective constructs" 28 --Walters explains that indirect methods, such as surveys or self-evaluations, in which students report what they do, build in a level of abstraction that introduces bias. 29 Direct methods involve researchers looking "directly" at the data about which they are making claims (for example, looking at students' actual coursework rather than looking at what students say about having produced it). Camacho looked at how many students had watched an ILI video but did not measure how much students understood of what they had seen.…”
Section: Literature Review (mentioning)
confidence: 99%