2005
DOI: 10.1177/1094428105275376

Interrater Agreement Reconsidered: An Alternative to the rwg Indices

Abstract: For continuous constructs, the most frequently used index of interrater agreement, rwg(1), can be problematic. Typically, rwg(1) is estimated with the assumption that a uniform distribution represents no agreement. The authors review the limitations of this uniform-null rwg(1) index and discuss alternative methods for measuring interrater agreement. A new interrater agreement statistic, awg(1), is proposed. The authors derive the awg(1) statistic and demonstrate that awg(1) is an analogue to Cohen's kappa, an int…
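As a rough illustration of the two indices the abstract contrasts, here is a minimal sketch. The abstract itself does not reproduce the formulas, so the exact forms are assumptions: the uniform-null variance (A² − 1)/12 for an A-option scale is the standard rwg(1) null, and the awg(1) denominator below uses the maximum variance attainable given the observed mean, as commonly reported for Brown and Hauenstein's statistic.

```python
import statistics

def rwg1(ratings, n_options):
    """Uniform-null r_wg(1): 1 minus observed variance over the variance of
    a discrete uniform distribution across the A response options."""
    s2 = statistics.variance(ratings)        # sample variance of judges' ratings
    sigma2_eu = (n_options ** 2 - 1) / 12    # uniform-null variance, (A^2 - 1)/12
    return 1 - s2 / sigma2_eu

def awg1(ratings, low, high):
    """a_wg(1) sketch (formula assumed, not taken from the abstract):
    agreement relative to the maximum sample variance possible given
    the observed mean on a scale bounded by [low, high]."""
    k = len(ratings)
    m = statistics.mean(ratings)
    s2 = statistics.variance(ratings)
    # Max sample variance for mean m: judges split between the scale endpoints.
    max_var = ((high + low) * m - m ** 2 - high * low) * k / (k - 1)
    return 1 - 2 * s2 / max_var

# Five judges rating one item on a 5-point scale
print(round(rwg1([4, 4, 5, 4, 5], 5), 3))   # 0.85
print(round(awg1([4, 4, 5, 4, 5], 1, 5), 3))  # 0.765
```

Note how awg(1) penalizes the same ratings more than rwg(1) here: with a mean near the scale endpoint, the attainable variance shrinks, which is exactly the mean-dependence the awg family is designed to capture.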



Cited by 259 publications (272 citation statements). References 44 publications.
“…Work-unit consensus was estimated with Brown and Hauenstein's (2005) awg index. We also calculated two types of intraclass correlation coefficients (ICCs) for job satisfaction and organizational commitment using the individual-level data. ICC(1) is a ratio of between-group to total variance (including between- and within-group variance) in scores.…”
Section: Data Analysis Approach
confidence: 99%
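The ICC(1) described in the statement above is conventionally computed from a one-way ANOVA decomposition of between- and within-group variance. A minimal sketch with illustrative group data (not from the cited study), assuming equal group sizes:

```python
import statistics

def icc1(groups):
    """ICC(1) from one-way ANOVA: (MSB - MSW) / (MSB + (n - 1) * MSW),
    where n is the common group size and k the number of groups."""
    n = len(groups[0])   # assumes equal group sizes
    k = len(groups)
    grand = statistics.mean(x for g in groups for x in g)
    # Between-group mean square
    msb = n * sum((statistics.mean(g) - grand) ** 2 for g in groups) / (k - 1)
    # Within-group mean square
    msw = sum((x - statistics.mean(g)) ** 2 for g in groups for x in g) / (k * (n - 1))
    return (msb - msw) / (msb + (n - 1) * msw)

# Three work units of three respondents each
groups = [[4, 5, 4], [2, 2, 3], [5, 4, 5]]
print(round(icc1(groups), 3))  # 0.816
```

A high ICC(1) like this indicates that group membership explains most of the rating variance, which is the usual complement to agreement indices when justifying aggregation.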
“…Data showed strong agreement, which justified aggregating the data to a higher level (Cohen, Doveh, & Nahum-Shani, 2009; Brown & Hauenstein, 2005). All schools were aggregated except two, where the number of respondents was less than 5.…”
Section: Results
confidence: 99%
“…To ensure that ratings could be aggregated, we evaluated inter-rater agreement (IRA) following literature recommendations [61,62], using three families of indices: James et al.'s rWG(J) [63,64] (based on multiple null distributions) [65]; Brown and Hauenstein's aWG(J) [66]; and the adjusted average deviation index ADM(J)adj [67]. In addition to the arithmetic mean of each uMARS score, we calculated a “response data-based weighted mean” (WDMEAN) [68].…”
Section: Data Analyses
confidence: 99%
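Of the three index families named in the statement above, the average-deviation family is the simplest to illustrate. A generic sketch of the unadjusted single-item ADM (not the adjusted ADM(J)adj variant from the cited paper), compared against the c/6 cutoff of Burke and Dunlap (2002), where c is the number of response options:

```python
import statistics

def ad_m(ratings):
    """AD_M for one item: mean absolute deviation of judges' ratings
    from the item mean (smaller values = stronger agreement)."""
    m = statistics.mean(ratings)
    return sum(abs(x - m) for x in ratings) / len(ratings)

ratings = [4, 4, 5, 4, 5]          # five judges, 5-point scale
ad = ad_m(ratings)
print(round(ad, 3), ad <= 5 / 6)   # c/6 cutoff with c = 5 options
```

Unlike rWG-type indices, ADM needs no null-distribution assumption, which is why it is often reported alongside them as a robustness check.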