Moral Appearances: Emotions, Robots, and Human Morality.

Mark Coeckelbergh

Research output: Contribution to journal › Article › Academic › peer-review

87 Citations (Scopus)
17 Downloads (Pure)

Abstract

Can we build ‘moral robots’? If morality depends on emotions, the answer seems negative. Current robots do not meet standard necessary conditions for having emotions: they lack consciousness, mental states, and feelings. Moreover, it is not even clear how we might ever establish whether robots satisfy these conditions. Thus, at most, robots could be programmed to follow rules, but it would seem that such ‘psychopathic’ robots would be dangerous, since they would lack full moral agency. However, I will argue that in the future we might nevertheless be able to build quasi-moral robots that can learn to create the appearance of emotions and the appearance of being fully moral. I will also argue that this way of drawing robots into our social-moral world is less problematic than it might first seem, since human morality also relies on such appearances.
Original language: English
Pages (from-to): 235-241
Number of pages: 7
Journal: Ethics and Information Technology
Volume: 12
Issue number: 3
DOIs
Publication status: Published - 2010

Keywords

  • METIS-269280
  • IR-76107

