2022
DOI: 10.1037/edu0000646
A multimedia effect for multiple-choice and constructed-response test items.

Abstract: This classroom experiment investigates the effects of adding representational pictures to multiple-choice and constructed-response test items to understand the role of the response format for the multimedia effect in testing. Participants were 575 fifth- and sixth-graders who answered 28 science test items—seven items in each of four experimental conditions in a balanced 2 × 2 within-subject design, with the factors being multimedia (text-only vs. text-picture) and response format (multiple-choice vs. constructed-response). …

Cited by 10 publications (11 citation statements)
References 62 publications
“…To the best of our knowledge, there is no meta-analytic evidence regarding the potentially moderating role of test format characteristics during a feedback intervention. However, the response format of a task has been shown to have a substantial impact on performance outcomes (Lindner et al., 2020), which might also influence the effectiveness of feedback. A study by Attali (2015) indicated that feedback was more effective when applied to constructed-response items than when applied to multiple-choice items.…”
Section: Definitions and Functions of Feedback (mentioning, confidence: 99%)
“…In contrast, constructed-response items do not present choice options, and learners have to generate a response themselves. It can be assumed that this generation process leads to better encoding and encourages the learner to engage in more effortful response behavior (Clariana, 2003; Lindner et al., 2020). Accordingly, feedback might effectively support this process by providing additional information that encourages the reorganization of knowledge.…”
Section: Definitions and Functions of Feedback (mentioning, confidence: 99%)
“…Despite the benefits of integration aids for learning outcomes, the context and goals of testing and learning environments are not the same. Thus, specific research focused on (digital) testing is required to better understand how and to what extent multimedia design principles (that have been established in the learning domain) can support test item design (Saß et al., 2017; Lindner et al., 2020). This research has direct practical implications for test designers, namely how different integration aids might be implemented to support item processing and administer test items in a way that enhances students’ test-taking experience.…”
Section: Methods (mentioning, confidence: 99%)
“…First, our study shows that, when carefully applied, multimedia design principles for reducing split attention in the learning domain can be implemented to help test takers’ item processing and to reduce construct-irrelevant variance caused by extraneous load in test item design. The successful transfer of multimedia design principles from the learning to the testing domain speaks to one of the commonalities of learning and testing, which is that both require an understanding of the presented information (Jarodzka et al., 2015; Lindner et al., 2017, 2020). Thus, design principles that facilitate comprehension of learning materials have the potential to facilitate test takers’ comprehension of test items as well (Lindner, 2020).…”
Section: Implications for Computer-Based Test Item Design (mentioning, confidence: 99%)
“…Therefore, efforts must be taken to ensure that students’ performance on computer-based tests is an accurate indicator of content competency [19]. Adding multiple-choice options to a test item can support students’ problem-solving process by narrowing down the mental search space of potential responses, so that students do not need to generate a response themselves, which reduces the complexity of the problem-solving process [21]. Alternative assessments featuring ICT enable ESL teachers to gauge, improve, and redirect their instruction to meet the requirements of their students, rather than relying exclusively on conventional testing formats that neither resemble the technology-enhanced instructional approaches adopted in the classroom nor successfully reflect the fundamental skills and proficiency of ESL learners [22].…”
Section: Computer-Based Assisted Assessment or E-Assessment (mentioning, confidence: 99%)