Background
Despite advances in surgical training, microsurgery is still taught largely through an apprenticeship model. To evaluate skill acquisition and provide targeted feedback to improve our training model, we applied the Structured Assessment of Microsurgery Skills (SAMS) to the training of microsurgical fellows. We hypothesized that fellows would demonstrate measurable improvement in performance over the study period and that this improvement would be rated consistently across evaluators.
Methods
Seven fellows were evaluated by 16 evaluators during 118 microsurgical cases over three 1-month evaluation periods within 1 year (2010-2011). Evaluators used SAMS, which consists of 12 items in four areas: dexterity, visuospatial ability, operative flow, and judgment. To validate the SAMS data, rodent microsurgical anastomoses performed by the fellows in the laboratory at the beginning and end of the study period were evaluated by five blinded plastic surgeons using the SAMS questionnaire. Primary outcomes were changes in scores between evaluation periods and inter-evaluator reliability.
Results
Between the first and second evaluation periods, all skill areas and overall performance improved significantly. Between the second and third periods, most skill areas improved, but only a few did so significantly. Operative errors decreased significantly between the first and subsequent periods (81 vs. 36; p<0.05). In the laboratory study, all skills improved significantly (p<0.05) or marginally (0.05≤p<0.10) between the two time points. The overall inter-evaluator reliability of SAMS was acceptable (α=0.67).
Conclusions
SAMS is a valid instrument for assessing microsurgical skill, providing individualized feedback with acceptable inter-evaluator reliability. The microsurgical fellows' skills improved significantly between the first two evaluation periods but plateaued thereafter. The use of SAMS is anticipated to enhance microsurgical training.