2009
DOI: 10.1177/0265532208101006

A meta-analysis of test format effects on reading and listening test performance: Focus on multiple-choice and open-ended formats

Abstract: A meta-analysis was conducted on the effects of multiple-choice and open-ended formats on L1 reading, L2 reading, and L2 listening test performance. Fifty-six data sources located in an extensive search of the literature were the basis for the estimates of the mean effect sizes of test format effects. The results using the mixed-effects model of meta-analysis indicate that multiple-choice formats are easier than open-ended formats in L1 reading and L2 listening, with the degree of format effect ranging from sm…
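The abstract refers to pooling format-effect sizes from 56 data sources under a mixed-effects meta-analytic model. As a rough, hypothetical sketch of the underlying pooling step (not the authors' actual analysis, which also modeled moderators), the following Python computes a random-effects mean effect size using the DerSimonian-Laird estimator of between-study variance; the effect sizes and variances are invented for illustration.

```python
import math

def dersimonian_laird(effects, variances):
    """Pool study effect sizes under a random-effects model,
    estimating between-study variance (tau^2) with DerSimonian-Laird."""
    k = len(effects)
    w = [1.0 / v for v in variances]                      # fixed-effect weights
    sw = sum(w)
    y_fe = sum(wi * yi for wi, yi in zip(w, effects)) / sw
    q = sum(wi * (yi - y_fe) ** 2 for wi, yi in zip(w, effects))  # heterogeneity Q
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - (k - 1)) / c)                    # between-study variance
    w_re = [1.0 / (v + tau2) for v in variances]          # random-effects weights
    sw_re = sum(w_re)
    y_re = sum(wi * yi for wi, yi in zip(w_re, effects)) / sw_re
    return y_re, math.sqrt(1.0 / sw_re), tau2

# Hypothetical standardized mean differences (multiple-choice minus
# open-ended scores) and their sampling variances -- illustrative only.
effects = [0.35, 0.20, 0.50, 0.10, 0.42]
variances = [0.02, 0.03, 0.015, 0.04, 0.025]
mean_es, se, tau2 = dersimonian_laird(effects, variances)
print(f"pooled d = {mean_es:.2f} (SE {se:.2f}), tau^2 = {tau2:.3f}")
```

A positive pooled d in this sketch would indicate higher scores on the multiple-choice format, matching the direction of the effect the abstract reports for L1 reading and L2 listening.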

Cited by 88 publications (65 citation statements)
References 32 publications
“…As Table 13 shows, no significant differences were found. Similar to earlier findings by Hohensinn and Kubinger (2011) and In'nami and Koizumi (2009), the CR items seemed slightly more difficult.…”
Section: Discussion (supporting)
confidence: 88%
“…Lissitz and Hou (2012) argued that different response formats do not necessarily elicit the same expression of skills and that CR items may assess skills in a different manner than MC items do. Lastly, MC items were generally found to be easier than CR items (e.g., In'nami & Koizumi, 2009). In the present study, the validity of both multiple-choice and constructed-response revision items is evaluated, as well as the differences in test characteristics between the two formats.…”
Section: Response Formats (mentioning)
confidence: 92%
“…Furthermore, our analogy tasks were open-ended; children had to construct their answers, enabling us to undertake fine-grained analyses of children's solving processes (e.g., Harpaz-Itay, Kaniel, & Ben-Amram, 2006; Resing & Elliott, 2011; Tzuriel & Galinka, 2000). Although we know that such tasks are more difficult than multiple-choice tasks (e.g., Behuniak, Rogers, & Dirir, 1996; In'nami & Koizumi, 2009; Martinez, 1999), and that individuals require more help during training to solve them (Stevenson et al., 2016), these types of tasks may improve learning after extensive instruction.…”
mentioning
confidence: 99%
“…Indubitably, test format affects test performance (Shohamy, 1984; Wolf, 1993; Kobayashi, 2002; Bowles & Salthouse, 2008; In'nami & Koizumi, 2009). Bachman and Palmer (1996, p. 46) are of the opinion that "the characteristics of the tasks used are always likely to affect test scores to some degree, so that there is virtually no test that yields only information about the ability we want to measure".…”
Section: Discussion (mentioning)
confidence: 99%