Tetraplegia from spinal cord injury paralyzes patients below the neck, leaving them unable to perform most activities of daily living. Brain-machine interfaces (BMIs) could give tetraplegic patients more independence by directly using brain signals to control external devices such as robotic arms or hands. The cortical grasp network has been of particular interest because of its potential to facilitate the restoration of dexterous object manipulation. However, a network that involves such high-level cortical areas may also carry additional information, such as the encoding of speech. Towards understanding the roles of different brain areas in the human cortical grasp network, neural activity related to motor intentions for grasping and speaking was recorded in a tetraplegic patient in the supramarginal gyrus (SMG), the ventral premotor cortex (PMv), and the somatosensory cortex (S1). We found that in the high-level brain areas SMG and PMv, grasps were well represented by the firing rates of neuronal populations as early as visual cue presentation. During motor imagery, grasps could be significantly decoded from all three brain areas. At identical neuronal population sizes, SMG and PMv achieved similarly high decoding performance, demonstrating their potential for grasp BMIs. During speech, SMG encoded both spoken grasps and colors, in contrast to PMv and S1, from which speech could not be significantly decoded. These findings suggest that grasp signals can be robustly decoded at the single-unit level from the cortical grasping circuit in humans. Data from PMv suggest a specialized role in grasping, while SMG's role is broader and extends to speech. Together, these results indicate that brain signals from high-level areas of the human cortex can be exploited for a variety of different BMI applications.