The long-standing rationalist tradition in moral psychology emphasizes the role of reason in moral judgment. A more recent trend places increased emphasis on emotion. Although both reason and emotion are likely to play important roles in moral judgment, relatively little is known about their neural correlates, the nature of their interaction, and the factors that modulate their respective behavioral influences in the context of moral judgment. In two functional magnetic resonance imaging (fMRI) studies using moral dilemmas as probes, we apply the methods of cognitive neuroscience to the study of moral judgment. We argue that moral dilemmas vary systematically in the extent to which they engage emotional processing and that these variations in emotional engagement influence moral judgment. These results may shed light on some puzzling patterns in moral judgment observed by contemporary philosophers.
Traditional theories of moral psychology emphasize reasoning and "higher cognition," while more recent work emphasizes the role of emotion. The present fMRI data support a theory of moral judgment according to which both "cognitive" and emotional processes play crucial and sometimes mutually competitive roles. The present results indicate that brain regions associated with abstract reasoning and cognitive control (including dorsolateral prefrontal cortex and anterior cingulate cortex) are recruited to resolve difficult personal moral dilemmas in which utilitarian values require "personal" moral violations, violations that have previously been associated with increased activity in emotion-related brain regions. Several regions of frontal and parietal cortex predict intertrial differences in moral judgment behavior, exhibiting greater activity for utilitarian judgments. We speculate that the controversy surrounding utilitarian moral philosophy reflects an underlying tension between competing subsystems in the brain.
Cooperation is central to human social behaviour. However, choosing to cooperate requires individuals to incur a personal cost to benefit others. Here we explore the cognitive basis of cooperative decision-making in humans using a dual-process framework. We ask whether people are predisposed towards selfishness, behaving cooperatively only through active self-control; or whether they are intuitively cooperative, with reflection and prospective reasoning favouring 'rational' self-interest. To investigate this issue, we perform ten studies using economic games. We find that across a range of experimental designs, subjects who reach their decisions more quickly are more cooperative. Furthermore, forcing subjects to decide quickly increases contributions, whereas instructing them to reflect and forcing them to decide slowly decreases contributions. Finally, an induction that primes subjects to trust their intuitions increases contributions compared with an induction that promotes greater reflection. To explain these results, we propose that cooperation is intuitive because cooperative heuristics are developed in daily life where cooperation is typically advantageous. We then validate predictions generated by this proposed mechanism. Our results provide convergent evidence that intuition supports cooperation in social dilemmas, and that reflection can undermine these cooperative impulses.
Traditional theories of moral development emphasize the role of controlled cognition in mature moral judgment, while a more recent trend emphasizes intuitive and emotional processes. Here we test a dual-process theory synthesizing these perspectives. More specifically, our theory associates utilitarian moral judgment (approving of harmful actions that maximize good consequences) with controlled cognitive processes and associates non-utilitarian moral judgment with automatic emotional responses. Consistent with this theory, we find that a cognitive load manipulation selectively interferes with utilitarian judgment. This interference effect provides direct evidence for the influence of controlled cognitive processes on moral judgment generally, and on utilitarian judgment in particular.