Computational models of selective spatial attention can reliably predict eye movements to complex images. However, researchers lack a simple way to measure covert representations of spatial attention in the brain and to link them to overt eye movement behavior, especially in response to natural scenes. Here, we predict eye movement patterns from spatial priority maps reconstructed from brain activity measured with functional magnetic resonance imaging (fMRI). First, we define a computational spatial attention model using a deep convolutional neural network (CNN) pre-trained for scene categorization. Next, we decode CNN unit activity from fMRI responses and reconstruct spatial priority maps by applying our computational spatial attention model to the decoded CNN activity. Finally, we use the reconstructed spatial priority maps to predict, both within and between individuals, eye movements measured in a subsequent behavioral experiment. These results demonstrate that features represented in CNN unit activity can guide spatial attention and eye movements, providing a crucial link between CNN models, brain activity, and behavior.
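The three-step pipeline summarized above can be sketched in code. The NumPy example below is only an illustrative reconstruction on simulated data, not the authors' implementation: it decodes CNN unit activity from fMRI responses with ridge regression, pools the decoded activity into a spatial priority map, and scores hypothetical fixations against that map with normalized scanpath saliency (NSS). All array sizes, the choice of decoder, the channel-pooling rule, and the NSS metric are assumptions made for illustration.

```python
# A minimal sketch (not the authors' code) of the decode-and-reconstruct
# pipeline, implemented with NumPy on simulated data. Shapes, the ridge
# decoder, the pooling rule, and the NSS metric are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: a small CNN layer and a small voxel set.
C, H, W = 32, 7, 7          # CNN channels x spatial grid
F = C * H * W               # flattened CNN features per image
V = 500                     # fMRI voxels
N_train, N_test = 200, 10   # images

# Simulate CNN feature maps and fMRI responses that linearly encode them.
cnn_train = rng.standard_normal((N_train, F))
cnn_test = rng.standard_normal((N_test, F))
encoding = rng.standard_normal((F, V)) / np.sqrt(F)   # unknown feature-to-voxel map
fmri_train = cnn_train @ encoding + 0.5 * rng.standard_normal((N_train, V))
fmri_test = cnn_test @ encoding + 0.5 * rng.standard_normal((N_test, V))

# Step 1: decode CNN unit activity from fMRI responses with ridge regression.
lam = 10.0
XtX = fmri_train.T @ fmri_train + lam * np.eye(V)
W_dec = np.linalg.solve(XtX, fmri_train.T @ cnn_train)      # (V, F) decoder weights
decoded = (fmri_test @ W_dec).reshape(N_test, C, H, W)

# Step 2: reconstruct a spatial priority map by pooling decoded unit activity
# across channels at each location and normalizing to [0, 1].
def priority_map(features):
    m = np.abs(features).mean(axis=0)            # (H, W) pooled activity
    m = m - m.min()
    return m / (m.max() + 1e-9)

priority_maps = np.stack([priority_map(f) for f in decoded])

# Step 3: score predicted fixations with normalized scanpath saliency (NSS):
# the mean z-scored priority value at fixated grid locations.
def nss(pmap, fixations):
    z = (pmap - pmap.mean()) / (pmap.std() + 1e-9)
    return float(np.mean([z[y, x] for y, x in fixations]))

fixations = [(3, 3), (1, 5)]                      # hypothetical fixation grid cells
print(f"example NSS: {nss(priority_maps[0], fixations):.2f}")
```

In this sketch, higher NSS values would indicate that locations with greater decoded priority were more likely to be fixated; the same scoring could in principle be applied across subjects to compare within- and between-individual prediction.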