The ability to grasp and manipulate objects requires controlling both finger movement kinematics and isometric force in rapid succession. Previous work suggests that these behavioral modes are controlled separately, but it is unknown whether the cerebral cortex represents them differently. Here, we asked how movement and force are represented cortically when executed sequentially with the same finger. We recorded high-density electrocorticography (ECoG) from the motor and premotor cortices of seven human subjects performing a movement-force motor task. We decoded finger movement [0.7 ± 0.3 fractional variance accounted for (FVAF)] and force (0.7 ± 0.2 FVAF) with high accuracy, yet found different spatial representations. In addition, we used a state-of-the-art deep learning method to uncover smooth, repeatable trajectories through ECoG state space during the movement-force task. We also summarized ECoG across trials and participants by developing a new metric, the neural vector angle (NVA). Together, these state-space techniques can help to investigate broad cortical networks. Finally, we were able to classify the behavioral mode from neural signals with high accuracy (90 ± 6%). Thus, finger movement and force appear to have distinct representations in motor/premotor cortices. These results inform our understanding of the neural control of movement, as well as the design of grasp brain-machine interfaces (BMIs).
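For readers unfamiliar with the decoding metric, the sketch below illustrates how FVAF is conventionally computed as 1 minus the ratio of residual to total variance; this is a minimal illustration assuming the standard definition, not the authors' analysis code, and the variable names and toy data are hypothetical.

```python
import numpy as np

def fvaf(y_true, y_pred):
    """Fractional variance accounted for: 1 - SSE/SST.

    Equals 1 for a perfect prediction and can be negative when the
    prediction is worse than simply predicting the mean of y_true.
    (Standard definition; assumed here, not taken from the paper.)
    """
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    sse = np.sum((y_true - y_pred) ** 2)          # residual sum of squares
    sst = np.sum((y_true - y_true.mean()) ** 2)   # total variance about the mean
    return 1.0 - sse / sst

# Toy usage: a noisy "decoded" finger trace compared with the measured trace
measured = np.sin(np.linspace(0, 4 * np.pi, 500))
decoded = measured + 0.3 * np.random.randn(500)
print(f"FVAF = {fvaf(measured, decoded):.2f}")
```

Under this definition, the reported values of about 0.7 FVAF mean the decoder explained roughly 70% of the variance in the measured movement or force traces.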