This paper presents the design and use of a Smart Prosthetic Hand, surveying computational methods for operating a robotic arm through brain, gesture, and voice signals. The human brain contains billions of neurons, and a prosthetic arm driven by a non-invasive, electroencephalogram (EEG) based Brain-Computer Interface (BCI) can be a powerful aid in daily life for persons with severe disabilities, particularly by letting them move the arm voluntarily. EEG data is recorded by a Brainsense headset and processed by a microprocessor, which moves the prosthetic hand by driving servo motors. In addition, a glove controller fitted with flex sensors replicates hand gestures to control the arm's movements. A voice control system sends spoken commands over Bluetooth to operate the prosthetic arm, and a microcontroller-based user interface allows all actions to be monitored. The arm is intended for patients with below-elbow amputations. The main objective of this work is to make people with physical disabilities less dependent on others for their everyday needs. The model is useful both in the real world, especially for those who are unable to use their hands, and in the classroom for college students studying robotics. The work presented here is a mini-project undertaken as part of the curriculum by second-year students of the Electronics & Communication Engineering department at Dayananda Sagar College of Engineering, Bangalore.
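The glove controller described above maps flex-sensor bend readings to servo motor positions. The sketch below illustrates one plausible form of that mapping, assuming a 10-bit ADC (0-1023 counts) and a hobby servo with a 0-180 degree range; the function name, calibration bounds, and value ranges are illustrative assumptions, not taken from the paper.

```python
def flex_to_servo_angle(adc_value, adc_min=200, adc_max=800):
    """Linearly map a flex-sensor ADC reading to a servo angle in degrees.

    adc_min / adc_max are assumed calibration values for a fully
    straight and fully bent finger; readings outside that window
    are clamped so the servo never receives an out-of-range command.
    """
    clamped = max(adc_min, min(adc_max, adc_value))
    return round((clamped - adc_min) * 180 / (adc_max - adc_min))
```

In a real firmware loop, a microcontroller would read each finger's flex sensor, call a mapping like this, and write the resulting angle to the corresponding servo channel.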