Abstract—Real-time adaptation between a human and an assistive device can improve the quality of life for amputees; however, it may be difficult to achieve since physical and mental states vary over time. This paper presents a co-adaptive human-machine interface (HMI) developed to control a virtual forearm prosthesis over a long period of operation. Direct physical performance measures for the requested tasks are calculated. Bioelectric signals are recorded using one pair of electrodes placed on the frontal face region of a user to extract mental (affective) measures (the entropy of the alpha band of the forehead electroencephalography signals) while the tasks are performed. By means of an effective adaptation algorithm, the proposed HMI adapts itself to the mental states of a user, thus improving its usability. The quantitative results from 16 users (including an amputee) show that the proposed HMI achieved better physical performance measures than the traditional (nonadaptive) interface (p-value < 0.001). Furthermore, there is a high correlation (correlation coefficient > 0.9; p-value < 0.01) between the physical performance measures and self-reported feedback based on the NASA TLX questionnaire. As a result, the proposed adaptive HMI outperformed a traditional HMI.