1994
DOI: 10.1177/026553229401100206

Effects of training on raters of ESL compositions

Abstract: Several effects of training on composition raters have been hypothesized but not investigated empirically. This article presents an analysis of the verbal protocols of four inexperienced raters of ESL placement compositions scoring the same essays both before and after rater training. The verbal protocols show that training clarified the intended scoring criteria for raters, modified their expectations of student writing and provided a reference group of other raters with which raters could compare themselves…

Cited by 165 publications (124 citation statements). References 11 publications.
“…Whether assessment practice is part of training or not is a crucial issue in training teachers to assess writing. By providing teachers with the opportunity to practice and use the assessment tool that the school or institution encourages, assessment reliability may be improved (Weigle, 1994). These results are also in line with those found by Wiseman (2012), who concluded that rater background had an impact on raters' use of scoring criteria.…”
Section: Discussion of Results (supporting)
confidence: 78%
“…The results of this study allowed the researchers to conclude that the use of assessment tools, such as scoring rubrics, is not enough to improve rater reliability (González & Roux, 2013; Saxton et al., 2012; Weigle, 1994), and therefore these results may have important implications for teaching practice. First, it is important to consider that rubrics are tools that teachers may use to facilitate their assessment of writing.…”
Section: Conclusion and Teaching Implications (mentioning)
confidence: 99%
“…A scoring rubrics sheet, called a performance descriptors sheet and typical of analytic rating, extracted from the IELTS Handbook (2007), was given to both raters and examinees. It was submitted to the raters mainly to avoid rater bias (Henning, 1987; Weigle, 1999; Eckes, 2012; Eckes, 2008) and to free them from evaluative personal judgment (Weigle, 1994, 1998) and background variables (Johnson & Lim, 2009) potentially affecting their scoring process. In the case of the examinees, the objective was to engage them in the related prompts (Bachman & Palmer, 2000) and to keep them cognizant of the underlying construct to be tested.…”
Section: Instruments (mentioning)
confidence: 99%
“…Thus, the rater, as opposed to the test materials, candidates or rating scale, is used as the window through which the evaluation of second language speaking performance can be observed. This type of qualitative data can tell us about the process of discriminating between candidates (Pollitt and Murray, 1996), and is useful for the purpose of rater training (Weigle, 1994) and for investigating how raters reach their decisions (Milanovic et al., 1996; Taylor, 2000). The most recent research of this type in China is a study on the interpretation and application of the assessment criteria in TEM4-Oral (Wang Haizhen, 2008).…”
Section: Introduction (mentioning)
confidence: 99%