2015
DOI: 10.1016/j.compcom.2015.01.006

Issues in transitioning from the traditional blue-book to computer-based writing assessment

Cited by 8 publications (8 citation statements)
References 24 publications (60 reference statements)
“…The ongoing development of grammar/style and spelling checkers has prompted several scholars to revisit these tools and their pedagogic uses (Buck 2008; Figueredo and Varnhagen 2006; McGee and Ericsson 2002; Potter and Fuller 2008; Vernon 2000). Investigations which compare the processes of composing by hand or on a computer explore their respective effects on the cognitive process (Medimorec and Risko 2016), early writing outcomes (Wollscheid, Sjaastad and Tømte 2016), and the implications for time-constrained assessments (Hunsu 2015; Mogey and Fluck 2015; Whithaus, Harrison and Midyette 2008). While research in the latter field assumes a widespread familiarity with word-processing and the supersession of handwriting by keyboarding skills, it also acknowledges that not all students are equally comfortable composing on a computer.…”
Section: Introduction
confidence: 99%
“…Lee, 2002). Hunsu et al. (2015) point out that the key factors affecting writing outcomes are the characteristics of participants, such as language proficiency (Breland et al., 2005) and competence with word processors (Bridgeman & Cooper, 1998; Harrington et al., 2000; Russell, 1999; Wolfe & Manalo, 2004). However, typing skill or experience with word processors may be less of a concern to students nowadays, given their access to computers and e-devices (Hunsu, 2015).…”
Section: Typewriting Chinese With Pinyin Input and Evaluation of Typewritten Texts
confidence: 99%
“…Specifically, comments such as "really tried hard" for handwritten essays demonstrate the empathy of evaluators. Precisely because the effect of the two kinds of presentation (handwritten and typewritten) on evaluators is unclear, as pointed out by Hunsu (2015), the current study focuses on the empathy effect mentioned in previous research. In addition, both Harrington et al. (2000) and Powers et al. (1994) advocate that a way to reduce the evaluation discrepancy is to explicitly train essay raters with extra emphasis on essay quality rather than the method of writing.…”
Section: Typewriting Chinese With Pinyin Input and Evaluation of Typewritten Texts
confidence: 99%
“…By contrast, another line of research has identified counterevidence for the positive relationship between onscreen annotation and rating quality. For instance, Hunsu (2015) argued that raters were also readers; when marking scripts for a test, raters were required to read these written texts carefully and then select a score that accurately reflected their quality. Given that raters' reading behaviors may differ in the two scoring modes, it is quite likely that raters scored scanned essays and their paper originals at differing levels of severity.…”
Section: Comparability of OSS Mode and PBS Mode
confidence: 99%
“…With the continual advances in information technology and the widespread use of the Internet, many test development agencies have replaced traditional paper-based scoring (PBS) with onscreen scoring (OSS), also known as onscreen marking (OSM; Bennett, 2003; Chen, White, McCloskey, Soroui, & Chun, 2011; Coniam, 2011b; Hunsu, 2015). The two modes differ greatly in how testees' writing is presented to human raters.…”
Section: Introduction
confidence: 99%