The principle of formal equality, one of the most fundamental and undisputed principles in ethics, states that a difference in treatment or value between two kinds of entities can only be justified on the basis of a relevant and significant difference between the two. Accordingly, when it comes to the question of what kind of moral claim an intelligent or autonomous machine might have, one way to answer it is by comparison with humans: Is there a fundamental difference between humans and machines that justifies unequal treatment, or will the two become increasingly continuous, making it increasingly dubious whether unequal treatment is justified? This question is inherently imprecise, however, because it presupposes a stance on what it means for two types of entities to be sufficiently similar, as well as on which types of properties are relevant to compare. In this paper, I will sketch a formal characterization of what it means for two types of entities to be continuous in this sense, discuss what it implies for two different types of entities to be (dis-)continuous with regard to both ethics and science, and discuss a dramatic difference in how two previously discontinuous entities might become continuous.
Title of host publication: AISB/IACAP World Congress: The machine question: AI, ethics and moral responsibility
Editors: David Gunkel, Joanna Bryson, Steve Torrance
Place of publication: Birmingham (UK)
Publisher: The Society for the Study of Artificial Intelligence and Simulation of Behaviour (AISB)
Publication status: Published - 2 Jul 2012
Event: AISB/IACAP World Congress 2012, Birmingham, United Kingdom, 2 Jul 2012 – 6 Jul 2012
Citation: Soraker, J. (2012). Is there a continuity between man and machine? In D. Gunkel, J. Bryson, & S. Torrance (Eds.), AISB/IACAP World Congress: The machine question: AI, ethics and moral responsibility (pp. 78-82). Birmingham (UK): The Society for the Study of Artificial Intelligence and Simulation of Behaviour (AISB).