Eye-based human-computer interaction (HCI) systems aim to provide people, especially disabled persons, with a new way of communicating with their surroundings. Such systems adopt a series of continual eye movements as input to perform simple control tasks. Identification of eye movements is the crucial technology in these eye-based HCI systems. At present, research on eye movement identification mainly focuses on frontal face images; in real applications, however, acquiring non-frontal face images is more realistic. In this paper, we discuss the identification of eye movements from non-frontal face images. Firstly, the original head-shoulder images at azimuths from 0° to ±60° are sampled without any auxiliary light source. Secondly, the non-frontal face region is detected using AdaBoost cascade classifiers. After that, we roughly extract eye windows using the integral projection function. Then, we propose a new method to calculate the x-y coordinates of the pupil center point by searching for the minimal intensity value in the eye windows. According to the trajectory of the pupil center points, different eye movements (eye moving left, right, up, or down) are successfully identified. A set of experiments is presented.
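As a rough illustration of the pipeline summarized above (cascade-based face detection, eye-window extraction by integral projection, minimum-intensity pupil localization, and trajectory-based movement labeling), the following Python sketch uses OpenCV's AdaBoost-trained Haar cascades. It is not the authors' implementation: the cascade file, window proportions, smoothing kernel, and movement threshold are illustrative assumptions, and a profile-face cascade would be needed to cover the larger azimuths.

```python
import cv2
import numpy as np

def detect_face(gray, cascade_file="haarcascade_frontalface_default.xml"):
    """Detect the largest face region with an AdaBoost cascade classifier."""
    cascade = cv2.CascadeClassifier(cv2.data.haarcascades + cascade_file)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])  # keep the largest detection
    return gray[y:y + h, x:x + w]

def extract_eye_windows(face):
    """Roughly locate the eye band with a horizontal integral projection."""
    h, w = face.shape
    upper = face[: h // 2, :]                      # eyes lie in the upper half of the face
    row_proj = upper.sum(axis=1).astype(float)     # integral projection along each row
    eye_row = int(np.argmin(row_proj))             # darkest row band approximates the eye line
    band = face[max(0, eye_row - h // 10): eye_row + h // 10, :]
    return band[:, : w // 2], band[:, w // 2:]     # split into left and right eye windows

def pupil_center(eye):
    """Estimate the pupil center as the minimum-intensity point in the eye window."""
    blurred = cv2.GaussianBlur(eye, (5, 5), 0)     # suppress noise before the intensity search
    _, _, min_loc, _ = cv2.minMaxLoc(blurred)
    return min_loc                                 # (x, y) coordinates within the eye window

def classify_movement(prev_center, curr_center, thresh=5):
    """Label the movement from the displacement of consecutive pupil centers."""
    dx = curr_center[0] - prev_center[0]
    dy = curr_center[1] - prev_center[1]
    if abs(dx) < thresh and abs(dy) < thresh:
        return "still"
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"
```

In use, each captured frame would be converted to grayscale, passed through `detect_face` and `extract_eye_windows`, and the per-frame pupil centers accumulated into a trajectory that `classify_movement` labels frame by frame.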