2016
DOI: 10.1002/au.30065

Rethinking Interrater Agreement

Cited by 2 publications (2 citation statements)
References 1 publication
“…I have found the single‐parameter estimates of agreement to be too opaque and complicated to be of much use, and I created a way to look within a set of ratings at the scale in more detail. The new kappa was previewed in Assessment Update (Eubanks 2016) and published in an article within the context of writing assessment (Eubanks 2017). As the statistics in the latter article show, an unusual aspect to our writing ratings is that the upper end of the scale (ready to graduate) shows more agreement than the lower end (not doing college‐level work).…”
Section: Reliability
Mentioning, confidence: 99%
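The quoted passage describes examining agreement at each level of a rating scale rather than relying on a single summary coefficient. As a rough illustration only (this is not Eubanks's published kappa; the function, data, and the per-category percent-agreement measure are invented for the example), the general idea might be sketched in Python like this:

# Illustrative sketch only: per-category agreement between two raters.
# Not Eubanks's published statistic; the measure here is simple percent
# agreement among the items where a given category was assigned by
# either rater.

def per_category_agreement(rater_a, rater_b, categories):
    """For each category, report how often the raters agree on the
    items where at least one of them assigned that category."""
    results = {}
    for c in categories:
        # Items where category c is in play for either rater.
        pairs = [(a, b) for a, b in zip(rater_a, rater_b) if c in (a, b)]
        if not pairs:
            results[c] = None  # category never used
        else:
            results[c] = sum(a == b for a, b in pairs) / len(pairs)
    return results

# Toy data on a 4-point writing scale
# (1 = not college-level work, 4 = ready to graduate).
a = [4, 4, 3, 2, 1, 4, 2, 3, 4, 1]
b = [4, 4, 3, 1, 2, 4, 1, 3, 4, 2]
print(per_category_agreement(a, b, [1, 2, 3, 4]))
# {1: 0.0, 2: 0.0, 3: 1.0, 4: 1.0}

On this invented data, agreement is perfect at the top of the scale and absent at the bottom, mirroring the asymmetry the quotation reports for the actual writing ratings.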
“…Part 1 of this article (Eubanks 2019) described a method of crowdsourcing learning outcomes from course instructors. This half is an exposition of qualities of the data set that has resulted, with a focus on student writing.…”
Mentioning, confidence: 99%