Abstract. This paper proposes a new human-machine interface conceived particularly for people with severe disabilities (specifically tetraplegic people), which allows them to interact with a computer in their everyday life by means of the mouse pointer. In this system, called FaceMouse, instead of the classical "pointer paradigm", which requires the user to look at the point to which the pointer must move, we propose a "derivative paradigm", in which the user does not indicate a precise position but rather the direction along which the mouse pointer must be moved. The proposed system is composed of a common, low-cost webcam and a set of computer vision techniques developed to identify parts of the user's face (often the only body part that a tetraplegic person can move) and exploit them to move the pointer. Specifically, the implemented algorithm is based on template matching to track the user's nose, using cross-correlation to compute the best match. Finally, several real applications of the system are described, and experimental results obtained with disabled users are reported.
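As a rough illustration of the tracking step named in the abstract (not the authors' implementation), the sketch below uses normalized cross-correlation template matching, as provided by OpenCV's `matchTemplate`, to locate a nose template in each webcam frame and to turn the nose's displacement from a rest position into a pointer direction. The template file name, search-window size, score threshold, and `pointer_step` gain are all assumptions made for this example.

```python
import cv2

# Hypothetical template of the user's nose, captured once at startup
# (e.g. by cropping a small region around the nose in an initial frame).
nose_template = cv2.imread("nose_template.png", cv2.IMREAD_GRAYSCALE)


def track_nose(frame_gray, prev_pos, search_radius=40):
    """Locate the nose in a grayscale frame via normalized cross-correlation.

    Searches only a window around the previous position to keep the
    per-frame cost low (this sketch assumes the nose stays away from the
    frame borders). Returns the new (x, y) of the template's top-left
    corner, or None if the match score is too weak.
    """
    h, w = nose_template.shape
    x0 = max(prev_pos[0] - search_radius, 0)
    y0 = max(prev_pos[1] - search_radius, 0)
    x1 = min(prev_pos[0] + search_radius + w, frame_gray.shape[1])
    y1 = min(prev_pos[1] + search_radius + h, frame_gray.shape[0])
    window = frame_gray[y0:y1, x0:x1]

    # Normalized cross-correlation: a score close to 1.0 means a strong match.
    scores = cv2.matchTemplate(window, nose_template, cv2.TM_CCORR_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(scores)
    if max_val < 0.8:  # threshold chosen arbitrarily for this sketch
        return None
    return (x0 + max_loc[0], y0 + max_loc[1])


def pointer_step(nose_pos, rest_pos, gain=0.3):
    """Derivative paradigm: the nose displacement from its rest position
    gives the direction (and speed) of pointer motion, not an absolute target."""
    dx = (nose_pos[0] - rest_pos[0]) * gain
    dy = (nose_pos[1] - rest_pos[1]) * gain
    return dx, dy
```

In this kind of direction-based mapping, small but sustained head movements produce continuous pointer motion, which is one way a "derivative paradigm" can suit users with a very limited range of motion.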