This study proposes a framework that takes surface electromyography (sEMG) and functional near-infrared spectroscopy (fNIRS) bio-signals as input and classifies them using convolutional neural networks (CNNs). The framework constitutes a real-time neural-machine interface that decodes human intention for upper-limb motions. Bio-signals from the two modalities are recorded simultaneously for eight movements corresponding to prosthetic-arm functions, with a focus on trans-humeral amputees. The fNIRS signals are acquired from the motor cortex, while the sEMG signals are recorded from the biceps muscles. The features selected for classification and command generation are the peak, minimum, and mean ΔHbO and ΔHbR values within a 2-s moving window; for sEMG, the waveform length, peak, and mean are extracted within a 150-ms moving window. The scheme classifies the eight motions with an average accuracy of 94.5%. These results validate the adopted methodology and its potential for future real-time neural-machine interfaces to control prosthetic arms.
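The sEMG feature extraction described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the sampling rate, the use of non-overlapping windows, and the function name are assumptions, since the abstract specifies only the 150-ms window and the three features (waveform length, peak, mean).

```python
import numpy as np

def semg_window_features(signal, fs=1000, window_ms=150):
    """Extract waveform length, peak, and mean from each moving window.

    Assumptions (not stated in the abstract): fs = 1000 Hz sampling
    rate and non-overlapping windows.
    """
    win = int(fs * window_ms / 1000)
    feats = []
    for start in range(0, len(signal) - win + 1, win):
        seg = signal[start:start + win]
        wl = np.sum(np.abs(np.diff(seg)))   # waveform length: summed absolute sample-to-sample change
        peak = np.max(np.abs(seg))          # peak absolute amplitude in the window
        mean = np.mean(seg)                 # mean value of the window
        feats.append((wl, peak, mean))
    return np.array(feats)                  # shape: (n_windows, 3)
```

Each row of the returned array is one feature vector that could then be fed, alongside the windowed fNIRS features, to the CNN classifier.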