Artificial neural networks (ANNs) trained on complex tasks are increasingly used in neuroscience to model brain dynamics, a process called brain encoding. Videogames have been extensively studied in artificial intelligence but have so far seen little use in brain encoding, even though they provide a promising framework for understanding brain activity in a rich, engaging, and active environment. A major challenge raised by complex videogames is that individual behavior is highly variable across subjects, and we hypothesized that ANNs need to account for subject-specific behavior in order to properly capture brain dynamics. In this study, we used ANNs to model functional magnetic resonance imaging (fMRI) and behavioral gameplay data, both collected while subjects played the Shinobi III videogame. Using imitation learning, we trained an ANN to play the game while closely replicating the unique gameplay style of individual participants. We found that the hidden layers of our imitation learning model encoded task-relevant neural representations and predicted individual brain dynamics more accurately than models trained on other subjects’ gameplay or than control models. The highest correlations between layer activations and brain signals were observed in biologically plausible brain areas, i.e., the somatosensory, attention, and visual networks. Our results highlight the potential of combining imitation learning, brain imaging, and videogames to uncover idiosyncratic aspects of behavior and how they relate to individual brain activity.
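The abstract does not detail the encoding pipeline, but the correlation between layer activations and brain signals it describes is typically computed by convolving each hidden unit's activation time course with a haemodynamic response function (HRF), resampling to the fMRI repetition time (TR), and correlating with each brain region's signal. The sketch below is a minimal, hypothetical illustration of that idea under those assumptions; the function names, the simplified double-gamma HRF, and the frame-rate and TR parameters are illustrative choices, not the authors' implementation.

```python
import numpy as np
from math import gamma as gamma_fn


def canonical_hrf(t):
    """Simplified double-gamma HRF sampled at times t (seconds)."""
    pos = t ** 5 * np.exp(-t) / gamma_fn(6)   # positive lobe, peak ~5 s
    neg = t ** 15 * np.exp(-t) / gamma_fn(16)  # post-stimulus undershoot
    return pos - neg / 6.0


def encode(layer_acts, bold, frame_dt=0.1, tr=1.49):
    """Correlate HRF-convolved unit activations with regional BOLD signals.

    layer_acts: (n_frames, n_units) activations at game-frame resolution
    bold:       (n_scans, n_regions) fMRI time series at TR resolution
    Returns an (n_units, n_regions) matrix of Pearson correlations.
    """
    hrf = canonical_hrf(np.arange(0, 32, frame_dt))
    # Causal convolution of each unit's time course with the HRF.
    conv = np.apply_along_axis(
        lambda x: np.convolve(x, hrf)[: len(x)], 0, layer_acts
    )
    # Resample the convolved activations onto the scanner's TR grid.
    idx = (np.arange(bold.shape[0]) * tr / frame_dt).astype(int)
    pred = conv[np.clip(idx, 0, conv.shape[0] - 1)]
    # Z-score both sides; a matrix product then yields Pearson r.
    z = lambda a: (a - a.mean(0)) / (a.std(0) + 1e-12)
    return z(pred).T @ z(bold) / bold.shape[0]
```

Units whose convolved activations track a region's signal would then stand out as high-correlation entries, which is how "biologically plausible" mappings to somatosensory, attention, and visual networks could be identified.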