Perceptual decision making is the act of choosing one option or course of action from a set of alternatives on the basis of available sensory evidence. Thus, when we make such decisions, sensory information must be interpreted and translated into behaviour. Neurophysiological work in monkeys performing sensory discriminations, combined with computational modelling, has paved the way for neuroimaging studies aimed at understanding decision-related processes in the human brain. Here we review findings from human neuroimaging studies in conjunction with data analysis methods that can directly link decisions and signals in the human brain on a trial-by-trial basis. This leads to a new view of the neural basis of human perceptual decision-making processes.
Findings from single-cell recording studies suggest that a comparison of the outputs of different pools of selectively tuned lower-level sensory neurons may be a general mechanism by which higher-level brain regions compute perceptual decisions. For example, when monkeys must decide whether a noisy field of dots is moving upward or downward, a decision can be formed by computing the difference in responses between lower-level neurons sensitive to upward motion and those sensitive to downward motion. Here we use functional magnetic resonance imaging and a categorization task in which subjects decide whether a presented image is a face or a house to test whether a similar mechanism is also at work for more complex decisions in the human brain and, if so, where in the brain this computation might be performed. Activity within the left dorsolateral prefrontal cortex is greater during easy decisions than during difficult decisions, covaries with the difference signal between face- and house-selective regions in the ventral temporal cortex, and predicts behavioural performance in the categorization task. These findings show that even for complex object categories, the comparison of the outputs of different pools of selectively tuned neurons could be a general mechanism by which the human brain computes perceptual decisions.
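The difference-signal mechanism described above can be illustrated with a minimal toy simulation. This is a hypothetical sketch, not the authors' model: pool sizes, gains, and noise levels are arbitrary assumptions chosen only to show how subtracting the mean responses of two selectively tuned pools yields a decision variable whose sign determines the choice.

```python
import random

def pooled_response(preferred: bool, signal_strength: float,
                    n_neurons: int = 100) -> float:
    """Mean response (arbitrary units) of a pool of selectively tuned
    neurons. Toy model: Gaussian noise around a baseline of 1.0, with an
    added gain when the stimulus matches the pool's preferred category."""
    gain = signal_strength if preferred else 0.0
    rates = [random.gauss(1.0 + gain, 0.5) for _ in range(n_neurons)]
    return sum(rates) / n_neurons

def decide(stimulus: str, signal_strength: float) -> str:
    """Compare the outputs of a face-selective pool and a house-selective
    pool; the sign of their difference is the decision variable."""
    face_pool = pooled_response(stimulus == "face", signal_strength)
    house_pool = pooled_response(stimulus == "house", signal_strength)
    difference = face_pool - house_pool  # decision variable
    return "face" if difference > 0 else "house"
```

With a strong signal (easy trials) the difference signal is large and choices are nearly always correct; lowering `signal_strength` toward zero makes the difference noise-dominated and performance falls toward chance, mirroring the easy-versus-difficult contrast in the task.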
Empathy is a multidimensional construct consisting of cognitive (inferring mental states) and emotional (empathic concern) components. Despite a paucity of research, individuals on the autism spectrum are generally believed to lack empathy. In the current study we used a new, photo-based measure, the Multifaceted Empathy Test (MET), to assess empathy multidimensionally in a group of 17 individuals with Asperger syndrome (AS) and 18 well-matched controls. Results suggested that while individuals with AS are impaired in cognitive empathy, they do not differ from controls in emotional empathy. Level of general emotional arousability and socially desirable answer tendencies did not differ between groups. Internal consistency of the MET's scales ranged from .71 to .92, and convergent and divergent validity were highly satisfactory.
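The internal-consistency values reported above (.71 to .92) are reliability coefficients of the conventional Cronbach's alpha kind. As a purely illustrative sketch (not the authors' analysis code), alpha for a scale can be computed from its item scores as follows:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a scale.

    items: list of item-score columns (one inner list per item,
    respondents in the same order across items).
    """
    k = len(items)                      # number of items
    n = len(items[0])                   # number of respondents

    def var(xs):                        # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    sum_item_vars = sum(var(col) for col in items)
    totals = [sum(col[i] for col in items) for i in range(n)]
    return (k / (k - 1)) * (1 - sum_item_vars / var(totals))
```

Values approaching 1 indicate that the items of a scale covary strongly, which is why the reported range of .71 to .92 is described as satisfactory.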