Problem-solving strategy is frequently cited as mediating the effects of response format (multiple-choice, constructed response) on item difficulty, yet there have been few direct investigations of examinee solution procedures. Fifty-five high school students solved parallel constructed-response and multiple-choice items that differed only in the presence of response options. Students were videotaped as they worked so that their solution strategies could be assessed. Strategies were categorized as "traditional," those associated with constructed-response problem solving (e.g., writing and solving algebraic equations), or "nontraditional," those associated with multiple-choice problem solving (e.g., estimating a potential solution). Surprisingly, participants sometimes adopted nontraditional strategies to solve constructed-response items. Furthermore, differences in difficulty between response formats did not correspond to differences in strategy choice: some items showed a format effect on strategy but no effect on difficulty, while other items showed the reverse. We interpret these results in light of the relative comprehension challenges posed by the two groups of items.
This study investigated the strategies subjects adopted to solve stem-equivalent SAT-Mathematics (SAT-M) word problems in constructed-response (CR) and multiple-choice (MC) formats. Parallel test forms of CR and MC items were administered to subjects representing a range of mathematical abilities. Format-related differences in difficulty were more prominent at the item level than for the test as a whole. At the item level, analyses of subjects' problem-solving processes appeared to explain both the differences and the similarities in difficulty. Differences in difficulty derived more from test-development factors than from cognitive factors: on items in which large format effects were observed, the MC response options often did not include the erroneous answers initially generated by subjects. Thus, the MC options may have given unintended feedback when a subject's initial answer was not among the options, or allowed a subject to choose the correct answer based on an estimate. Similarities between formats occurred because subjects used similar methods to solve both CR and MC items. Surprisingly, when solving CR items, subjects often adopted strategies commonly associated with MC problem solving. For example, subjects appeared adept at estimating plausible answers to CR items and checking those answers against the demands of the item stem. Although there may be good reasons for using constructed-response items in large-scale testing programs, multiple-choice questions of the sort studied here should provide measurement that is generally comparable to stem-equivalent constructed-response items.