The development of assistive robots is gaining momentum in the robotics and biomedical fields. This paper presents an assistive robotic system for object manipulation to aid people with physical disabilities. The robotic arm design is imported into a simulated environment and tested in a virtual world. The research includes the development of a versatile design and testing platform for robotic applications with joint torque requirements, workspace restrictions, and control tuning parameters. Live user inputs and camera feeds are used to test the robot's movement in the virtual environment. The environment and user interface are built on the Robot Operating System (ROS). Live brain-computer interface (BCI) commands from a trained user are successfully harvested and used as an input signal to pick a goal point from 3D point cloud data and to calculate the goal position of the robot's mobile base, placing the goal point within the robot arm's workspace. The platform allows quick design iterations to meet different application criteria and tuning of controllers for the desired motion.
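The base-placement step described above (driving the mobile base so a selected goal point falls inside the arm's workspace) can be sketched as follows. This is a minimal illustration, not the paper's implementation; the arm reach and safety margin values are assumed for the example.

```python
import math

def base_goal_for_object(base_xy, object_xy, reach=0.6, margin=0.1):
    """Compute a mobile-base goal that places an object inside the arm's workspace.

    base_xy, object_xy: (x, y) positions in the map frame, in meters.
    reach:  assumed maximum horizontal reach of the arm (m) -- hypothetical value.
    margin: safety margin kept inside the workspace boundary (m).
    Returns (goal_x, goal_y, heading): the base goal pose, facing the object.
    """
    bx, by = base_xy
    ox, oy = object_xy
    dx, dy = ox - bx, oy - by
    dist = math.hypot(dx, dy)
    heading = math.atan2(dy, dx)              # face the selected object
    # Drive forward until the object is (reach - margin) away; if it is
    # already within reach, stay put.
    travel = max(dist - (reach - margin), 0.0)
    gx = bx + travel * math.cos(heading)
    gy = by + travel * math.sin(heading)
    return gx, gy, heading
```

In a ROS setup this pose would typically be published as a navigation goal for the mobile base's planner; here it is kept as plain geometry so the calculation itself is visible.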
The paper describes the design of a hybrid brain-computer interface (BCI) system that provides control commands to manipulate a robotic arm. The goal is to facilitate BCI-controlled real-time robotic applications through a semi-autonomous operation mode that accepts multiple commands. Simple tasks, such as moving forward or turning, are executed from a single BCI command, while more complicated tasks, such as grabbing or pushing an object, are automated once the task is selected. The robotic arm's vision system uses an Intel RealSense D435 camera for image and depth perception; a point cloud generated from the camera serves as the interface through which the user selects an object. Selecting an object to manipulate determines the goal position and location. Once the object and its location are determined, the software interface moves the robotic arm so that the selected object lies within the arm's workspace. The current robotic arm design uses an Open Bionics Brunel dexterous hand as the end effector, which allows human-like hand actions. A simulation platform is developed to verify the behavior of the complete system: a dexterous robotic arm on a mobile platform. The system design and results using the hybrid BCI system are demonstrated.
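Turning a user's selection on the camera feed into a 3D goal point, as described above, amounts to back-projecting a pixel and its depth reading through the camera's intrinsics. The sketch below uses the standard pinhole model (the same model the RealSense SDK applies in its deprojection routine); the intrinsic values in the usage note are illustrative, not the D435's calibrated parameters.

```python
def deproject(u, v, depth, fx, fy, cx, cy):
    """Back-project pixel (u, v) with a depth reading (m) into camera
    coordinates using the pinhole model.

    fx, fy: focal lengths in pixels; cx, cy: principal point.
    Returns (x, y, z) in the camera frame, in meters.
    """
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return x, y, depth
```

For example, a pixel at the principal point maps straight ahead of the camera: `deproject(320.0, 240.0, 1.0, 600.0, 600.0, 320.0, 240.0)` gives `(0.0, 0.0, 1.0)`. The resulting point would then be transformed from the camera frame into the robot's base or map frame before being used as a goal.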
The proposed assistive hybrid brain-computer interface (BCI) semiautonomous mobile robotic arm demonstrates a design that is (1) adaptable, observing environmental changes with sensors and deploying alternate solutions, and (2) versatile, receiving commands from the user's brainwave signals through a noninvasive electroencephalogram (EEG) cap. Composed of three integrated subsystems (a hybrid BCI controller, an omnidirectional mobile base, and a robotic arm), the proposed robot maps commands to the user's brainwaves elicited by a set of specific physical or mental tasks. Sensors and camera systems enable both the mobile base and the arm to operate semiautonomously. The mobile base's SLAM algorithm provides obstacle avoidance and path planning to help the robot maneuver safely. The robot arm calculates and executes the joint movements needed to pick up or drop off an object the user selects via a brainwave-controlled cursor on a camera feed. Validation, testing, and implementation of the subsystems were conducted in Gazebo. Communication between the BCI controller and each subsystem is tested independently: a loop of prerecorded brainwave data for each specific task is used to ensure that the mobile base command is executed, and the same prerecorded file is used to move the robot arm cursor and initiate a pick-up or drop-off action. A final system test is conducted in which BCI controller input moves the cursor and selects a goal point. Successful virtual demonstrations of the assistive robotic arm show the feasibility of restoring movement capability and autonomy to a disabled user.
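The mapping from brainwave-classified mental tasks to robot commands described above can be sketched as a simple dispatch table with a confidence gate, so that uncertain classifications do not trigger motion. The task labels and threshold below are hypothetical; the paper does not specify its classifier outputs.

```python
# Hypothetical mapping from classified mental/physical tasks to commands.
COMMANDS = {
    "left_hand":  "turn_left",
    "right_hand": "turn_right",
    "feet":       "move_forward",
    "rest":       "stop",
}

def dispatch(label, confidence, threshold=0.7):
    """Map a classified mental-task label to a robot command.

    Classifications below the confidence threshold, and unknown labels,
    fall back to "stop" to avoid spurious motion.
    """
    if confidence < threshold:
        return "stop"
    return COMMANDS.get(label, "stop")
```

In a live system each command string would be translated into a velocity or action message for the mobile base or arm; the gate keeps noisy EEG classifications from moving the robot.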