Current models of decision-making assume that the brain gradually accumulates evidence and drifts toward a threshold which, once crossed, triggers a choice. These models have been especially successful in primate research; however, transposing them to human fMRI paradigms has proved challenging. Here, we exploit the face-selective visual system and test whether emotional facial features decoded from multivariate fMRI signals during a dynamic perceptual decision-making task are related to the parameters of computational models of decision-making. We show that trial-by-trial variations in the pattern of neural activity in the fusiform gyrus reflect facial emotional information and modulate drift rates during deliberation. We also observed an inverse-urgency signal based in the caudate nucleus that was independent of sensory information but appeared to slow decisions, particularly when task information was ambiguous. Taken together, our results characterize how decision parameters from a computational model (i.e., drift rate and urgency signal) are involved in perceptual decision-making and reflected in the activity of the human brain.

Author contributions: YY and AD designed research; YY and MT collected the data; YY, MD, and YZ analyzed the data and contributed methods; and YY, AD, LF, and PC contributed to result interpretation. YY drafted the initial manuscript; all authors contributed to the writing of this manuscript.
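The evidence-accumulation mechanism summarized above can be illustrated with a minimal simulation. The sketch below is not the authors' fitted model: the function name, parameter values, and the linear collapsing-bound urgency term are illustrative assumptions (the inverse-urgency signal reported here, which slows decisions, would correspond to a negative `urgency_slope`, i.e., a bound that rises over time).

```python
import random

def simulate_ddm(drift, threshold=1.0, urgency_slope=0.0,
                 noise_sd=0.1, dt=0.001, max_t=5.0, seed=0):
    """Simulate one drift-diffusion trial (illustrative sketch only).

    Evidence accumulates with mean rate `drift` plus Gaussian noise;
    an optional urgency term collapses the decision bound over time.
    Returns (choice, rt): choice is +1 or -1, rt is in seconds,
    or (None, None) if no bound is crossed before `max_t`.
    """
    rng = random.Random(seed)
    x, t = 0.0, 0.0
    while t < max_t:
        # Euler step of the diffusion process.
        x += drift * dt + rng.gauss(0.0, noise_sd) * dt ** 0.5
        t += dt
        # Positive urgency lowers the effective bound as time passes.
        bound = max(threshold - urgency_slope * t, 0.05)
        if abs(x) >= bound:
            return (1 if x > 0 else -1), t
    return None, None
```

With a fixed noise seed, adding a positive urgency slope can only shorten the response time, since the same evidence trajectory meets a lower bound sooner; a drift rate scaled by decoded sensory evidence would likewise speed or slow accumulation.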