Digital information literacy (DIL)—generally defined as the ability to obtain, understand, evaluate, and use information in a variety of digital technology contexts—is a skill deemed critical for success both in higher education and in the global networked economy. To determine whether college graduates possess the requisite DIL knowledge and skills, higher education institutions must be able to administer valid assessments of DIL and use their results. In this paper, we provide a comprehensive review of existing definitions of this construct in major frameworks from higher education and the workforce and propose an operational definition of DIL. Next, we review existing assessments of information literacy and related constructs, covering their features, construct alignment, and psychometric properties (i.e., reliability and validity evidence). Finally, we discuss challenges and considerations surrounding the design, implementation, and use of next-generation assessments of DIL. We offer this review as a resource for higher education institutions in selecting among existing assessments or in designing their own measures.
Problem-solving strategy is frequently cited as mediating the effects of response format (multiple-choice vs. constructed response) on item difficulty, yet there are few direct investigations of examinee solution procedures. Fifty-five high school students solved parallel constructed response and multiple-choice items that differed only in the presence of response options. Students were videotaped as they worked so that their solution strategies could be assessed. Strategies were categorized as “traditional” (those associated with constructed response problem solving, e.g., writing and solving algebraic equations) or “nontraditional” (those associated with multiple-choice problem solving, e.g., estimating a potential solution). Surprisingly, participants sometimes adopted nontraditional strategies to solve constructed response items. Furthermore, differences in difficulty between response formats did not correspond to differences in strategy choice: some items showed a format effect on strategy but no effect on difficulty, whereas other items showed the reverse. We interpret these results in light of the relative comprehension challenges posed by the two item formats.
scite is a Brooklyn-based organization that helps researchers better discover and understand research articles through Smart Citations: citations that display the context of the citation and describe whether the citing article provides supporting or contrasting evidence. scite is used by students and researchers around the world and is funded in part by the National Science Foundation and the National Institute on Drug Abuse of the National Institutes of Health.