Background: Participatory design methods are a key component of designing tailored implementation strategies. These methods vary in the resources required to execute them and to analyze their outputs. No work to date has examined the extent to which the output obtained from different participatory design approaches varies.

Methods: We concurrently used two separate participatory design methods: (1) field observations and qualitative interviews (i.e., traditional contextual inquiry) and (2) rapid crowdsourcing (an innovation tournament). Our goal was to generate and compare information to tailor implementation strategies to increase the use of evidence-based data collection practices among one-to-one aides working with children with autism. Each method was executed and analyzed by study team members blinded to the output of the other method. We estimated the personnel time and monetary costs associated with each method to further facilitate comparison.

Results: Observations and interviews generated nearly twice as many implementation strategies (n = 26) as the innovation tournament (n = 14). When strategies were classified using the Expert Recommendations for Implementing Change (ERIC) taxonomy, there was considerable overlap in the content of identified strategies. However, strategies derived from observations and interviews were more specific than those from the innovation tournament. Five strategies (13%) reflected content unique to observations and interviews, and three strategies (8%) were unique to the innovation tournament. Only observations and interviews identified implementation strategies related to adapting and tailoring to context; only the innovation tournament identified implementation strategies that used incentives. Observations and interviews required more than three times as many personnel hours as the innovation tournament, but the innovation tournament was more costly overall because of the technological platform used.

Conclusions: There was substantial overlap in the content derived from observations and interviews and the innovation tournament, although the findings from observations and interviews were more specific. However, the innovation tournament yielded unique information. To select the best participatory design approach for informing implementation strategy design in a particular context, researchers should carefully consider the unique advantages of each method and weigh the resources available to invest in the process.
Data collection is an important component of evidence-based behavioral interventions for children with autism, but many one-to-one aides (i.e., behavioral support staff) do not systematically collect the quantitative data necessary for best-practice client progress monitoring. Data collection on clients’ behaviors often involves labor-intensive pen-and-paper practices. In addition, the solitary nature of one-to-one work limits opportunities for timely supervisor feedback, potentially reducing motivation to collect data. We incorporated principles from behavioral economics and user-centered design to develop a phone-based application, Footsteps, to address these challenges. We interviewed nine one-to-one aides working with children with autism and seven supervisors to elicit their ideas for app development. We then developed the Footsteps app prototype and tested it with 10 one-to-one aides and supervisors over three testing cycles. At each cycle, one-to-one aides rated app usability. Participants provided 76 discrete suggestions for improvement, including 29 new app features (e.g., a behavior timer), 20 feature modifications (e.g., a numeric type-in option for behavior frequency), four flow modifications (e.g., deleting a redundant form), and 23 out-of-scope suggestions. Of the participants who tested the app, 90% rated its usability as good or excellent. These results support continuing to develop Footsteps and testing its impact in a clinical trial.