We propose that moral competence consists of five distinct but related elements: (1) having a system of norms; (2) mastering a moral vocabulary; (3) exhibiting moral cognition and affect; (4) exhibiting moral decision making and action; and (5) engaging in moral communication. We identify some of the likely triggers that may convince people to (justifiably) ascribe each of these elements of moral competence to robots. We suggest that humans will treat robots as moral agents (who have some rights and obligations and are targets of blame) if they perceive them to have at least elements (1) and (2) and one or more of elements (3)-(5). We then examine whether robots with such emerging moral competence are in fact suitable and accepted as social partners [4]. The moral standing and abilities of machines will therefore emerge from, and be in part constrained by, the relations that people are willing to form with them [5].
II. ELEMENTS OF MORAL COMPETENCE
A. The Framework in Overview

A competence is an aptitude, a qualification, a dispositional capacity to deal adequately with certain tasks. Uncontroversially, moral competence must deal with the task of moral decision making and action. From Aristotle to Kant to Kohlberg, morality has been about "doing the right thing." Similarly, recent questions about moral properties of robots have centered on decisions about life and death [6], often in action dilemmas [7], which have prominence in psychology and cognitive science as well [8]-[10].