Purpose: Performance assessment using high-fidelity simulation is problematic due to the difficulty of developing valid and reliable evaluation tools. The Delphi technique is a consensus-based content-generation method used for multiple purposes, such as policy development, best-evidence practice guidelines, and competency assessments. The purpose of this study was to develop checklists, using a modified Delphi technique, to evaluate the performance of practicing anesthesiologists managing two simulated scenarios.
Methods: The templates for two simulation scenarios were emailed to five anesthesiologists, who were asked to generate performance items. Data were collated anonymously and returned to participants. An a priori decision was made to delete items endorsed by ≤ 20% of participants. This process of collection, collation, and re-evaluation was repeated until consensus was reached. Four independent raters then used the checklists to assess three subjects managing the two simulation scenarios. Interrater reliability was assessed using the average-measures intraclass correlation coefficient (ICC), and repeated-measures analysis of variance (ANOVA) was used to assess differences in difficulty between the scenarios.
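As a rough illustration of the average-measures ICC used here, the following Python sketch computes a two-way random-effects, average-measures ICC (Shrout and Fleiss ICC(2,k)) from a subjects × raters matrix of checklist totals. The rater scores in the example are hypothetical placeholders, not study data.

```python
import numpy as np

def icc_2k(scores: np.ndarray) -> float:
    """Average-measures, two-way random-effects ICC (Shrout & Fleiss ICC(2,k)).

    `scores` is an (n_subjects, n_raters) matrix of checklist totals.
    """
    n, k = scores.shape
    grand = scores.mean()
    row_means = scores.mean(axis=1)   # per-subject means
    col_means = scores.mean(axis=0)   # per-rater means

    ss_rows = k * np.sum((row_means - grand) ** 2)   # between-subject SS
    ss_cols = n * np.sum((col_means - grand) ** 2)   # between-rater SS
    ss_total = np.sum((scores - grand) ** 2)
    ss_error = ss_total - ss_rows - ss_cols          # residual SS

    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_error = ss_error / ((n - 1) * (k - 1))

    # Average-measures, absolute-agreement form of the ICC
    return (ms_rows - ms_error) / (ms_rows + (ms_cols - ms_error) / n)

# Hypothetical checklist totals: 3 subjects scored by 4 independent raters.
ratings = np.array([
    [98, 101, 97, 100],
    [85,  88, 84,  86],
    [110, 112, 108, 111],
], dtype=float)

print(f"ICC(2,k) = {icc_2k(ratings):.3f}")
```

With only three subjects any such estimate is unstable; the sketch is meant only to show how average-measures agreement across raters is computed, not to reproduce the study's reported values.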
Results: The final checklists included 131 items for scenario 1 and 126 items for scenario 2. The mean interrater reliability (average-measures ICC) was 0.921 for scenario 1 and 0.903 for scenario 2. Repeated-measures ANOVA revealed no statistically significant difference in difficulty between the scenarios.

Discussion: The Delphi technique can be very useful for generating consensus-based evaluation tools with high content and face validity compared with subjective evaluation tools. Because there was no difference in scenario difficulty, these scenarios can be used to determine the effect of educational interventions on performance.
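To make the scenario-difficulty comparison concrete, here is a minimal repeated-measures ANOVA sketch in Python, with scenario as the within-subject factor; with only two scenarios this is equivalent to a paired t-test. The scores shown are hypothetical, not the study's data.

```python
import numpy as np
from scipy import stats

def rm_anova(scores: np.ndarray):
    """One-way repeated-measures ANOVA.

    `scores` is an (n_subjects, n_conditions) matrix; here the two
    conditions are the two simulated scenarios.
    """
    n, c = scores.shape
    grand = scores.mean()

    ss_cond = n * np.sum((scores.mean(axis=0) - grand) ** 2)   # scenario effect
    ss_subj = c * np.sum((scores.mean(axis=1) - grand) ** 2)   # subject effect
    ss_total = np.sum((scores - grand) ** 2)
    ss_error = ss_total - ss_cond - ss_subj

    df_cond, df_error = c - 1, (n - 1) * (c - 1)
    f_stat = (ss_cond / df_cond) / (ss_error / df_error)
    p_value = stats.f.sf(f_stat, df_cond, df_error)
    return f_stat, p_value

# Hypothetical checklist percentage scores: 3 subjects x 2 scenarios.
scenario_scores = np.array([
    [72.0, 74.0],
    [65.0, 63.0],
    [81.0, 80.0],
])

f_stat, p = rm_anova(scenario_scores)
print(f"F(1, 2) = {f_stat:.2f}, p = {p:.3f}")
```

A non-significant F statistic in this layout is what supports treating the two scenarios as comparable in difficulty for pre/post comparisons.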