Objective: The increase in minimally invasive surgery and endovascular procedures has led to a decrease in open surgical experience. This may adversely affect both surgical training and postoperative management. Because it poses no risk to patients, simulation training may be a solution to these problems. The social distancing required by COVID-19 has negatively affected the simulation educational environment. To date, only limited research has examined whether skills are evaluated objectively and equally in simulation training, especially in microsurgery. The purpose of this study was to analyze the objectivity and equality of simulation evaluation results conducted in a contest format.

Methods: A nationwide recruitment process was conducted to select study participants. Participants were recruited from a pool of qualified physicians with less than 10 years' experience. In this study, the simulation procedure consisted of incising a 1 mm thick artificial blood vessel and suturing it with a 10-0 thread under a surgical microscope. To evaluate the simulation procedures, a scoring chart was developed with a maximum of 5 points for each of eight evaluation criteria. Five neurosurgical supervisors from different hospitals were asked to use this scoring chart to grade the simulation procedures.

Results: Initially, we planned to have the neurosurgical supervisors score the simulation procedures by direct observation. However, due to COVID-19, some study participants were unable to attend, so some simulation procedures had to be scored by video review. A total of 14 trainees participated in the study. Cronbach's alpha coefficient among the scorers was 0.99, indicating a strong correlation. There was no statistically significant difference between the scores from video review and those from direct observation. However, there was a statistically significant difference (p < 0.001) between the scores for some criteria. For the eight criteria, individual scorers assigned scores in a consistent pattern; this pattern differed between scorers, however, indicating that some scorers were more lenient than others.

Conclusions: The results of this study indicate that both video review and direct observation are useful and highly objective techniques for evaluating simulation procedures, despite differences in score assignment patterns between individual scorers.