Neurodegenerative diseases pose significant challenges to patients' mobility and autonomy. Against this backdrop of rapid technological advances, brain–computer interfaces (BCIs) have emerged as a promising tool for improving these patients' quality of life. In this study, we explore the feasibility of using low-cost commercial EEG headsets, such as NeuroSky and BrainLink, to control robotic arms integrated into autonomous wheelchairs. These headbands, which natively report attention and meditation values, have been adapted to provide intuitive control based on the eight EEG band values they read, from Delta to Gamma (split into low/medium and high Gamma), collected from the user's prefrontal area with only two non-invasive electrodes. To ensure precise and adaptive control, we incorporate a neural network that interprets these values in real time so that the robotic arm's response matches the user's intentions. The results suggest that this combination of BCIs, robotics, and machine-learning techniques such as neural networks is not only technically feasible but also has the potential to radically transform how patients with neurodegenerative diseases interact with their environment.
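
As an illustrative sketch only (the abstract does not specify the network architecture, command set, or preprocessing), the following Python snippet shows how a small feedforward model could map the eight band-power values to a discrete robotic-arm command; the layer sizes, command labels, normalization, and randomly initialized weights are assumptions for illustration, standing in for a trained model.

```python
import numpy as np

# Hypothetical sketch: a small feedforward network mapping the eight
# EEG band-power values (Delta ... mid-Gamma) from the headset to a
# discrete robotic-arm command. All names and sizes below are assumed.

BANDS = ["delta", "theta", "low_alpha", "high_alpha",
         "low_beta", "high_beta", "low_gamma", "mid_gamma"]
COMMANDS = ["rest", "open_gripper", "close_gripper", "move_arm"]  # assumed labels

rng = np.random.default_rng(0)

# Randomly initialized weights stand in for a trained model.
W1 = rng.normal(0.0, 0.1, size=(len(BANDS), 16))
b1 = np.zeros(16)
W2 = rng.normal(0.0, 0.1, size=(16, len(COMMANDS)))
b2 = np.zeros(len(COMMANDS))


def classify(band_powers: np.ndarray) -> str:
    """Map one vector of 8 band-power values to a command label."""
    x = np.log1p(band_powers)                 # compress the large dynamic range
    x = (x - x.mean()) / (x.std() + 1e-8)     # per-sample normalization
    h = np.tanh(x @ W1 + b1)                  # hidden layer
    logits = h @ W2 + b2                      # output layer
    return COMMANDS[int(np.argmax(logits))]


# Example: one raw reading from the headset (arbitrary values).
sample = np.array([60000, 30000, 12000, 9000, 7000, 5000, 2500, 1800], dtype=float)
print(classify(sample))
```

In a real-time loop, such a function would be called on each new band-power packet streamed from the headset, with the predicted label forwarded to the robotic-arm controller.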