This paper presents a novel hands-free human-machine interface (HMI) for elderly and disabled people that fuses multi-modal bioinformation extracted from forehead electromyography (EMG) signals and facial images of the user. The interface allows users to drive an electric-powered wheelchair using facial movements such as jaw clenching and eye blinking. An indoor environment was set up to evaluate the application of this interface. Five able-bodied subjects participated in the experiment, driving the intelligent wheelchair along designated routes while avoiding obstacles. The new interface is compared with traditional joystick control in terms of ease of control, travel time, wheelchair trajectory, and error commands. The experimental results show that the proposed control method is comparable to joystick control and can serve as a hands-free controller for the intelligent wheelchair.