On Agreement Indices for Nominal Data
Sociometric Research, 1988
DOI: 10.1007/978-1-349-19051-5_6

Cited by 59 publications (47 citation statements)
References 27 publications
“…The correlation (pooled within-groups correlation coefficient) between the two observers with respect to codes on the influence and the proximity dimensions is .79 and .80, with 95% confidence intervals of .64 to .94 and .63 to .97, respectively. We conclude that the interobserver agreement for the instrument measuring daily hassles in the classroom and those measuring student-teacher behaviour is near or meets the norm of .80 (Popping, 1988).…”
Citation type: mentioning
confidence: 79%
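
The interval quoted above comes from a pooled within-groups coefficient; as a rough, hypothetical illustration of how a 95% confidence interval for a plain Pearson correlation is obtained, the Fisher z-transformation can be sketched as follows (the sample size n is invented, not taken from the cited study):

import math

def fisher_ci(r: float, n: int, z_crit: float = 1.96) -> tuple[float, float]:
    """Approximate 95% CI for a Pearson correlation via the Fisher z-transform."""
    z = math.atanh(r)                    # Fisher z-transformation of r
    se = 1.0 / math.sqrt(n - 3)          # standard error of z
    lo, hi = z - z_crit * se, z + z_crit * se
    return math.tanh(lo), math.tanh(hi)  # back-transform to the r scale

print(fisher_ci(0.79, 40))  # hypothetical n of 40; roughly (0.63, 0.88)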
“…As Riffe, Lacy and Fico (1998) report, this lack of agreement is due to the different contexts in which the analysis can be conducted. According to Kvalseth (1989), for instance, a kappa coefficient of .61 is an indicator of high agreement, whereas Popping (1988) proposes that a value of .80 represents high overall reliability. After reviewing norms proposed by several methodologists, Neuendorf (2002) concluded that a "coefficient of .90 or greater would be acceptable to all, .80 or greater would be acceptable in most situations, and below that, there exists disagreement" (p. 145).…”
Section: Discussion
Citation type: mentioning
confidence: 99%
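
Since the benchmarks quoted here (.61 per Kvalseth, .80 per Popping, .90 per Neuendorf) all concern Cohen's kappa, a minimal self-contained sketch of the statistic may be useful; the two coders' label lists below are invented for illustration:

from collections import Counter

def cohens_kappa(coder_a: list, coder_b: list) -> float:
    """Cohen's kappa: chance-corrected agreement between two coders on nominal data."""
    n = len(coder_a)
    p_o = sum(a == b for a, b in zip(coder_a, coder_b)) / n   # observed agreement
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    # expected chance agreement from each coder's marginal category proportions
    p_e = sum((freq_a[c] / n) * (freq_b[c] / n) for c in set(coder_a) | set(coder_b))
    return (p_o - p_e) / (1 - p_e)

a = ["pos", "pos", "neg", "neu", "pos", "neg", "neu", "pos", "neg", "pos"]
b = ["pos", "neg", "neg", "neu", "pos", "neg", "pos", "pos", "neg", "pos"]
print(f"kappa = {cohens_kappa(a, b):.2f}")  # 0.67: above .61, below the .80 norm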
“…Intercoder reliability on the final pilot of the coding form with a 15% text subsample was Cohen's kappa = .91, p = .000, for the classification of top-level text structure and for the number of features signaling the top-level structure. Cohen's kappa levels above .80 are generally acceptable and indicative of high reliability (Ellis, 1994; Krippendorff, 2013; Neuendorf, 2002; Popping, 1988). The Coding Form is presented in Figure A1 in Appendix A.…”
Section: Developing a Coding Scheme
Citation type: mentioning
confidence: 99%
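
If scikit-learn is available, the hand computation above can be cross-checked against its built-in implementation; this is only a usage note, not part of the cited study's procedure:

from sklearn.metrics import cohen_kappa_score

a = ["pos", "pos", "neg", "neu", "pos", "neg", "neu", "pos", "neg", "pos"]
b = ["pos", "neg", "neg", "neu", "pos", "neg", "pos", "pos", "neg", "pos"]
print(cohen_kappa_score(a, b))  # ≈ 0.672, matching the sketch above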