Moral decision making in autonomous systems: Enforcement, moral emotions, dignity, trust, and deception

Ronald Craig Arkin, Patrick Ulam, Alan R. Wagner

Research output: Contribution to journal › Article › peer-review

135 Scopus citations


As humans are progressively pushed further downstream in the decision-making process of autonomous systems, the need arises to ensure that moral standards, however defined, are adhered to by these robotic artifacts. While meaningful inroads have been made in this area regarding the use of ethical lethal military robots, including work by our laboratory, these needs transcend the warfighting domain and are pervasive, extending to eldercare, robot nannies, and other forms of service and entertainment robotic platforms. This paper presents an overview of the spectrum and specter of ethical issues raised by the advent of these systems, and various technical results obtained to date by our research group, geared towards managing ethical behavior in autonomous robots in relation to humanity. These include: 1) the use of an ethical governor capable of restricting robotic behavior to predefined social norms; 2) an ethical adaptor which draws upon the moral emotions to allow a system to constructively and proactively modify its behavior based on the consequences of its actions; 3) the development of models of robotic trust in humans and its dual, deception, drawing on psychological models of interdependence theory; and 4) an approach towards the maintenance of dignity in human-robot relationships.
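The ethical governor in item 1) can be pictured as a run-time filter interposed between a robot's behavioral system and its actuators, suppressing any proposed action that would violate a predefined norm set. The following is a minimal illustrative sketch of that filtering idea only, not the authors' implementation; all names here (`Action`, `FORBIDDEN_TARGETS`, `violates_norm`, `govern`) are hypothetical.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Action:
    """A candidate behavior proposed by the robot's control system."""
    name: str
    target: str


# Hypothetical norm set: targets that predefined constraints
# place off-limits for an "engage" action.
FORBIDDEN_TARGETS = {"noncombatant", "protected_site"}


def violates_norm(action: Action) -> bool:
    """Return True if the proposed action breaches a predefined norm."""
    return action.name == "engage" and action.target in FORBIDDEN_TARGETS


def govern(proposed: list[Action]) -> list[Action]:
    """Act as an ethical governor: pass through only those proposed
    actions that do not violate the norm set."""
    return [a for a in proposed if not violates_norm(a)]


permitted = govern([
    Action("engage", "hostile_vehicle"),
    Action("engage", "noncombatant"),  # suppressed by the governor
])
```

In this sketch the governor is purely restrictive, mirroring the paper's distinction between the governor (which constrains behavior before execution) and the ethical adaptor of item 2), which modifies behavior after the fact based on consequences.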

Original language: English (US)
Article number: 6099675
Pages (from-to): 571-589
Number of pages: 19
Journal: Proceedings of the IEEE
Issue number: 3
State: Published - Mar 2012

All Science Journal Classification (ASJC) codes

  • General Computer Science
  • Electrical and Electronic Engineering


