Initially excluded from many evaluations of education research, single-case designs have recently received wider acceptance within and beyond special education. The growing approval of single-case design has coincided with an increasing departure from conventions such as the visual analysis of results and a growing emphasis on effect sizes comparable with those associated with group designs. The use of design-comparable effect sizes by the What Works Clearinghouse has potential implications for the experimental literature in special education, which is largely composed of single-case designs that may not meet the assumptions required for statistical analysis. This study examined the compatibility of single-case design studies appearing in 33 special education journals with the design-comparable effect sizes and related assumptions described by the What Works Clearinghouse. Of the 1,425 randomly selected single-case design articles published from 1999 to 2021, 59.88% did not satisfy assumptions related to design, number of participants, or treatment replications. The rejection rate varied by journal emphasis, with publications dedicated to students with developmental disabilities losing the largest proportion of articles. A discussion of the implications for the interpretation of the evidence base follows a description of the results.
Impact and Implications
A survey of a sample of articles in special education identified single-case designs published over a 22-year period. Analyses noted whether articles were compatible with the design-comparable effect size required in the What Works Clearinghouse's reviews of evidence. Results suggest that a large proportion of studies are eliminated from consideration, which can limit the extent to which scientific evidence can guide the selection of interventions for certain populations and influence research conducted in schools and other applied settings.