Purpose: Discriminating surgeon technical skill computationally has proven a complex problem. Previous research has shown that non-expert crowd evaluations of surgical performances are as accurate as the gold standard, expert surgeon review [1]. The aim of this research is to determine whether crowdsourced evaluators assign higher technical skill ratings to videos of performances played at increased speed, how such manipulation affects discrimination between skill levels, and whether any rating increase depends on the evaluator being consciously aware that the video has been manually manipulated.

Methods: A set of ten peg transfer videos (5 novices, 5 experts) was used to evaluate the perceived technical skill of the performers at each playback speed tested (0.4x-3.6x). Objective metrics of technical skill were also computed for comparison by applying the corresponding manipulation to the kinematic data of each performance. Two videos, one of an expert and one of a novice performing dry-lab laparoscopic peg transfer tasks, were used to obtain evaluations at each playback speed (0.2x-3.0x) of whether the video appeared to be playing at real-time speed.

Results: We found that while perceived technical skill increased with playback speed for both novice and expert videos, the increase was significantly greater for experts. Each 0.4x increase in playback speed was associated with, on average, a 0.72-point increase in the GOALS score (95% CI: 0.60-0.84 point increase; p < 0.001) for expert videos but only a 0.24-point increase in the GOALS score (95% CI: 0.13-0.36 point increase; p < 0.001) for novice videos.

Conclusion: Because increased playback speed inflates perceived technical skill more for experts than for novices, the difference between novice and expert skill levels of surgi-