A brain-machine interface (BMI) transforms neural activity into action and sensation into perception (Figure 1). In a BMI system, neural signals recorded from the brain are fed into a decoding algorithm that translates them into motor outputs to control a variety of practical devices for motor-disabled people [1]-[5]. Feedback from the prosthetic device, conveyed to the user either via normal sensory pathways or directly through brain stimulation, closes the control loop.

An important aspect of a BMI is its capability to distinguish between different patterns of brain activity, each associated with a particular intention or mental task. Adaptation is therefore a key component of a BMI: on the one hand, users must learn to modulate their neural activity to generate distinct brain patterns; on the other hand, machine-learning techniques must discover the individual brain patterns characterizing the mental tasks executed by the user. In essence, a BMI is a two-learner system that must engage in a mutual adaptation process [6], [7].

Future neuroprosthetics (robots and exoskeletons controlled via a BMI) will be so tightly coupled with the user that the resulting system can replace and restore impaired limb functions, because they are controlled by the same neural signals as their natural counterparts. However, robust and natural interaction between subjects and prostheses over long periods of time remains a major challenge. To tackle this challenge, we can take inspiration from natural motor control, where