1994
DOI: 10.1007/bf01575179
A study of the effect of HyperCard and pen-paper performance assessment methods on expert-novice chemistry problem solving

Cited by 8 publications (5 citation statements)
References 19 publications
“…In the majority of studies of the use of ICT for assessment of creative and critical thinking, reviewed by Harlen & Deakin Crick (2003b), the assessment was intended to help development of understanding and skills as well as to assess attainment in understanding and skills. The effectiveness of computer programs for both these purposes was demonstrated by those studies where computer-based assessment was compared with assessment by paper and pencil (Jackson, 1989; Kumar et al., 1993). The mechanism for the formative impact was the feedback that students received from the program.…”
Section: Using Summative Assessment To Help Learning
confidence: 99%
“…This interpretation is supported by previous studies involving the role of the graphing environment (Sinclair and de Freitas, 2013), graphing experience (Roth and McGinn, 1997), work on other assessments showing such learning through intermediate-constraint question formats (Meir et al., 2019), and a constrained experimental design task (Meir, in preparation). While prior studies explicitly comparing science skill assessment by environment are rare, at least one prior study in a similar context also found that students performed better on digital assessment than on pen-and-paper, and similarly speculated that the constraints and affordances aided students in drawing on latent knowledge (Kumar et al., 1994).…”
Section: The Constraints and Affordances Of Digital Assessments May A
confidence: 99%
“…With notable exceptions (e.g., in biology: Urban-Lurain et al., 2013; Beggrow et al., 2014; Weston et al., 2015; Vitale et al., 2015, 2019; Zhai et al., 2020), there are fewer examples of digital tools designed to automatically measure practices in complex, less well-defined problem spaces, in large part because of the difficulty of valid automatic scoring (e.g., Beggrow et al., 2014; Ha and Nehm, 2016). Furthermore, few studies compare assessment between pen-and-paper vs. digital formats, and those have differing conclusions about student performance across the two environments (e.g., Kumar et al., 1994; Aberg-Bengtsson, 2006; Guimaraes et al., 2018; Oqvist and Nouri, 2018). As development of digital assessment tools increases, the field would benefit from more research on how differences in paper versus digital environments interact with the way students demonstrate understanding through performance tasks.…”
Section: Revealing and Assessing Student Science Practice Knowledge
confidence: 99%
“…Kumar, White, and Helgeson found that high school students who used a HyperCard-based chemistry test outperformed students using a pen-and-paper version of the same task [36]. The task consisted of balancing five chemical equations.…”
Section: Student Testing
confidence: 99%