A home-assistance robot system based on characteristics of the electrooculogram (EOG) and the tongue signal is developed in this study to provide daily-life assistance for people with physical mobility disabilities. The system relies on five simple actions of the head itself (blinking twice in a row, tongue extension, upward tongue rolling, and left and right eye movements) to perform the corresponding mouse operations on the system screen (moving up/down/left/right and double-clicking). In this paper, brain-network analysis and a backpropagation (BP) neural network are used to recognize these five types of actions. The results show that, averaged across all subjects, the recognition rates for eye blinks, tongue extension, and upward tongue rolling were 90.17%, 88.00%, and 89.83%, respectively, and that after training the subjects could complete the five movements in sequence within 12 seconds. This indicates that people with physical disabilities can use the system to perform daily self-care tasks quickly and accurately, bringing considerable convenience to their lives.
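For illustration only, a minimal sketch of the kind of BP (backpropagation) neural network classifier mentioned above is shown below; it is not the authors' implementation. It assumes pre-extracted feature vectors per signal epoch (feature dimension, sample counts, and network size are placeholder choices) and uses scikit-learn's `MLPClassifier` as a stand-in feed-forward network trained with backpropagation.

```python
# Illustrative sketch: a five-class BP neural network classifier for the
# head/tongue actions described in the abstract. Feature dimension (32),
# epoch counts, and hidden-layer size are assumptions, not values from the paper.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler

ACTIONS = ["double_blink", "tongue_extension", "tongue_roll_up",
           "eye_left", "eye_right"]

# Placeholder data: in practice these would be features extracted from
# EOG / tongue-signal epochs (e.g., band power or brain-network metrics).
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 32))          # 500 epochs, 32 features each (assumed)
y = rng.integers(0, len(ACTIONS), 500)  # one action label per epoch

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

scaler = StandardScaler().fit(X_train)

# Single-hidden-layer feed-forward network trained with backpropagation.
clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0)
clf.fit(scaler.transform(X_train), y_train)

print("Held-out accuracy:", clf.score(scaler.transform(X_test), y_test))
```

With real labeled feature vectors in place of the random placeholders, the held-out accuracy reported by the last line would correspond to the per-class recognition rates discussed in the abstract.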