2019
DOI: 10.1016/j.asw.2019.03.001

Developing and examining validity evidence for the Writing Rubric to Inform Teacher Educators (WRITE)

Cited by 17 publications (11 citation statements)
References 33 publications
“…At this stage of dissemination, the learning process feels somewhat different because students at this stage find it easier to understand what the teacher conveys. The students are all enthusiastic, actively asking questions, giving opinions, and working on each step of the teaching material [39], [40]. Therefore, the average writing result of students at the dissemination stage is higher than in the trial phase.…”
Section: Results (mentioning, confidence: 99%)
“…In other words, many researchers are interested in the degree to which rating errors and systemic biases introduce irrelevant structural variation in the interpretation of ratings. Concerning rater impacts, features like rubrics can also lead to psychometric constraints in rater-mediated writing assessments (Hodges et al, 2019).…”
Section: Literature Review (mentioning, confidence: 99%)
“…To evaluate student writing and improve writing outcomes, there is evidence to support using general outcome measures (e.g., number of correct writing sequences), and there is evidence demonstrating that analytic rubrics are effective, valid, and reliable measures of writing performance (Harmey et al., 2019; Hodges et al., 2019). The CBM-W measures referred to earlier can supplement the rubric data and collectively be used to inform intervention planning (Allen et al., 2018).…”
Section: The DDDM Process (mentioning, confidence: 99%)
“…This is especially helpful when explaining a grade or an instructional plan to a parent (Swain & Friedrich, 2018). Although rubrics are not technically adequate to be a measure of progress monitoring (Heldsinger & Humphry, 2013; McMaster & Espin, 2007), there is an evidence base demonstrating their effectiveness in evaluating student writing (Harmey et al., 2019; Hodges et al., 2019).…”
(mentioning, confidence: 99%)