Computer classifiers have successfully identified various tasks from eye movement statistics. However, whether humans can classify task from eye movements has rarely been studied. Across two experiments, we examined whether human observers could classify task based solely on the eye movements of other individuals. In Experiment 1, human classifiers were shown one of three displays of eye movements: Fixations, shown as blue circles whose size increased with fixation duration; Scanpaths, shown as yellow arrows; and Videos, in which a neon green dot moved around the screen. An additional Scene manipulation displayed these eye movement properties either on the original scene on which the task (Search, Memory, or Rating) had been performed or on a black background with no scene information. Experiment 2 used similar methods but displayed only Fixations and Videos, with the same Scene manipulation. Both experiments showed successful classification of Search. Interestingly, Search was classified best in the absence of the original scene, particularly in the Fixation condition. Memory was also classified above chance, with the strongest classification occurring for Videos in the presence of the scene. Additional analyses of the pattern of correct responses in these two conditions revealed which eye movement properties successful classifiers relied on. These findings demonstrate the conditions under which humans can extract information from eye movement characteristics, and they provide insight into the relative success or failure of previous computer classifiers.