While it is known that the same brain area can be involved in multiple functions, such multimodality has yet to be utilized in applications like brain-computer interfaces (BCIs). For instance, could the same BCI decode both hand movements and speech? Here we studied stereo-EEG (sEEG) patterns in two patients with epilepsy performing motor and language tasks as part of the same experimental session. sEEG electrodes were implanted in various regions of the frontal and temporal cortices. In the motor task, the patients wrote digits by hand, whereas in the language task they pronounced or imagined pronouncing words. The superior frontal gyrus (SFG) and superior temporal gyrus (STG) were engaged in both tasks, whereas the middle frontal gyrus (MFG) and middle temporal gyrus (MTG) were engaged only in the handwriting task. In addition to task-execution neural patterns, preparatory activity was observed, particularly in the STG. Based on the activity differences in the STG and SFG, articulatory versus imagined speech could be decoded using a machine learning classifier. We suggest that multimodal BCIs could be used in the future to improve speech restoration and rehabilitation in neurological patients.
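
The abstract states that articulatory versus imagined speech was decoded with a machine learning classifier, but does not specify the algorithm or the features. The sketch below is a minimal illustration of such a two-class decoding pipeline, assuming per-channel band-power features from STG/SFG contacts and a linear discriminant classifier; the channel count, feature choice, classifier type, and the synthetic data are all assumptions for illustration, not the authors' method.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Hypothetical dimensions: the abstract gives no channel counts or
# feature definitions, so synthetic band-power features stand in for
# real sEEG recordings from STG and SFG contacts.
n_trials_per_class = 60
n_channels = 16  # assumed number of STG/SFG contacts

# Synthetic trials: the imagined-speech class is drawn around a shifted
# mean to mimic a decodable difference between the two conditions.
overt = rng.normal(loc=0.0, scale=1.0, size=(n_trials_per_class, n_channels))
imagined = rng.normal(loc=0.4, scale=1.0, size=(n_trials_per_class, n_channels))

X = np.vstack([overt, imagined])
y = np.array([0] * n_trials_per_class + [1] * n_trials_per_class)  # 0 = overt, 1 = imagined

# Standardize features, fit a linear discriminant classifier, and
# estimate decoding accuracy with 5-fold cross-validation.
clf = make_pipeline(StandardScaler(), LinearDiscriminantAnalysis())
scores = cross_val_score(clf, X, y, cv=5)
print(f"Mean cross-validated accuracy: {scores.mean():.2f}")
```

With real data, the feature matrix would be built from trial-aligned sEEG epochs rather than random draws, but the cross-validated train/evaluate structure would remain the same.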