2021
DOI: 10.1002/jaba.811

Interobserver agreement: A preliminary investigation into how much is enough?

Abstract: Interobserver agreement (IOA) is important for research and practice, and supports the consistency of behavioral data (Kahng et al., 2011). Although general parameters for how much IOA is needed have been suggested (Bailey & Burch, 2018), it is unknown whether the total number of sessions with IOA might affect the obtained IOA coefficient. In this study, IOA was reanalyzed using functional analysis data at various cutoffs. Obtained IOA from these analyses was then compared to the original IOA. Overall, results suggested tha…
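The abstract does not state how the cutoffs were applied. A minimal sketch of one plausible reading, in which the overall IOA coefficient is recomputed from only the first fraction of sessions with IOA, might look like the following; the function name and per-session scores here are hypothetical, not taken from the study.

```python
import random

def mean_ioa_at_cutoff(session_ioa, proportion):
    """Mean IOA coefficient using only the first `proportion`
    of sessions with IOA data (e.g., 0.2 = first 20%)."""
    n = max(1, round(len(session_ioa) * proportion))
    subset = session_ioa[:n]
    return sum(subset) / len(subset)

# Hypothetical per-session IOA scores from a functional analysis.
random.seed(1)
scores = [random.uniform(0.85, 1.0) for _ in range(40)]

for cutoff in (0.1, 0.2, 0.33, 0.5, 1.0):
    print(f"cutoff {cutoff:.0%}: mean IOA = {mean_ioa_at_cutoff(scores, cutoff):.1%}")
```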

Cited by 17 publications (8 citation statements) | References 18 publications
“…Most studies we examined reported IOA for measures of the participants' behavior (i.e., dependent variables), which is consistent with prior findings (e.g., Falakfarsa et al., 2022; Hausman et al., 2022; Kostewicz et al., 2016). Far fewer reported IOA for procedural fidelity.…”
Section: Discussion (supporting)
confidence: 89%
“…IOA is now widely reported in research articles published in applied behavior analysis journals and discussed as part of the training of behavior analysts. For example, Hausman et al. (2022) found that 98.9% of studies published in the Journal of Applied Behavior Analysis (JABA) from 2014 through 2018 reported IOA. Cooper et al. (2020) devoted 10 pages (pp.…”
(mentioning)
confidence: 99%
“…Interrater agreement was measured by having a second, independent rater code 20% of the articles (Hausman et al., 2021; Kratochwill et al., 2010) meeting the inclusion criteria for the variables described above; these articles were selected at random. An agreement was scored if both raters coded an item identically (e.g., both coded “concurrent‐chains” as the method for determining preference).…”
Section: Methods (mentioning)
confidence: 99%
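The agreement scoring this excerpt describes, an exact match between two raters' codes on each item, is a simple percent-agreement calculation. A minimal sketch, with illustrative codes rather than data from the study:

```python
def percent_agreement(codes_a, codes_b):
    """Proportion of items two independent raters coded identically
    (an agreement = an exact match on that item)."""
    matches = sum(a == b for a, b in zip(codes_a, codes_b))
    return matches / len(codes_a)

# Hypothetical codes for the 20% of articles selected at random
# for reliability coding (method for determining preference).
rater_1 = ["concurrent-chains", "paired-stimulus", "MSWO", "concurrent-chains"]
rater_2 = ["concurrent-chains", "paired-stimulus", "MSWO", "paired-stimulus"]

print(f"Interrater agreement: {percent_agreement(rater_1, rater_2):.0%}")  # 75%
```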
“…If, for instance, two observers randomly coded 15% of individuals’ performance as “Pass” and 85% of individuals’ performance as “Fail,” their percentage agreement would be about 75% due to chance alone. Furthermore, percent agreement is also influenced by the amount and types of scores collected for calculating agreement (Hausman et al., 2022). An estimate that accounts for chance agreement, such as kappa or weighted kappa, may be more accurate.…”
Section: Discussion (mentioning)
confidence: 99%
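The ~75% figure in this excerpt is the expected chance agreement for two raters who independently code 15% of items “Pass” and 85% “Fail”: 0.15² + 0.85² = 0.745. A minimal sketch of that calculation and of Cohen's kappa, which corrects observed agreement for it (the data are illustrative):

```python
from collections import Counter

def percent_agreement(codes_a, codes_b):
    """Observed proportion of identical codes (p_o)."""
    return sum(a == b for a, b in zip(codes_a, codes_b)) / len(codes_a)

def expected_chance_agreement(codes_a, codes_b):
    """Agreement expected if both raters coded independently according
    to their own marginal proportions (p_e in Cohen's kappa)."""
    n = len(codes_a)
    counts_a, counts_b = Counter(codes_a), Counter(codes_b)
    return sum((counts_a[c] / n) * (counts_b[c] / n)
               for c in set(codes_a) | set(codes_b))

def cohens_kappa(codes_a, codes_b):
    """Cohen's kappa: (p_o - p_e) / (1 - p_e)."""
    p_o = percent_agreement(codes_a, codes_b)
    p_e = expected_chance_agreement(codes_a, codes_b)
    return (p_o - p_e) / (1 - p_e)

# Marginals matching the excerpt: 15% "Pass", 85% "Fail".
codes = ["Pass"] * 3 + ["Fail"] * 17
print(expected_chance_agreement(codes, codes))  # 0.745 -> ~75% by chance alone
```

With marginals this skewed, a raw agreement near 75% reflects chance rather than reliability, which is why a chance-corrected index such as kappa is recommended.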