In this paper, I set out to address the moral philosophical issues that arise when human nature undergoes changes and we eventually turn into cyborgs. We are now faced with the fact that we have become masters of evolution; even a general decision not to modify the body is an active choice. When our bodies and brains are attached to technology, our ways of being in the world will gradually change. Furthermore, the possibility of technological enhancement allows for variety among us. My main concern is therefore to draw attention to bodily presence within moral philosophy (MacIntyre, 2002) in order to shed light on the significance of embodiment for the development of a common understanding of the good. This background will function as a fundamental framework supporting an analysis of how cyborg ethics will relate to human ethics. Drawing on the work of John Rawls, I will discuss conditions for social justice and equality in a future human and/or cyborg society. In A Theory of Justice (1972), Rawls defines principles of justice for regulating an ideal society. He presents a model of a fair choice situation, which affords social justice by engaging participants in choosing mutually acceptable principles of justice. In this hypothetical situation, participants act behind a “veil of ignorance” regarding their own social position in the future society. Given such circumstances, individuals will select principles that are fair to all. On this basis, I set out to explore the strength of this argument in a future human and/or cyborg society.
The conversion into cyborgs is described by Pearson (2000). The first step involves using gene technology in the development of Homo optimus. Next, with the help of biotechnology, we turn into Homo cyberneticus and Homo hybridus. Finally, with Homo machinicus, we are faced with a species with no biological origin, heading for eternal life, an idea also reflected by Tipler in his book with the subtitle Modern Cosmology, God and the Resurrection of the Dead (Tipler, 1994). Additionally, the idea that humanity should strive to enhance itself is reflected in the philosophy of transhumanism, and Moor, too, argues in favour of enhancement, with reference to the autonomy of the responsible individual (2005, p. 129). This is already taking place: people have implants for therapeutic purposes, and Professor Kevin Warwick (2002, 2003) is actively experimenting with implanting technology into his own body. One of his experiments concerns the possibility of future distributed cognition and involves coupling a machine to the human nervous system (Warwick, 2003, p. 134). Warwick also touches upon ethical dilemmas arising from future scenarios as well as from his own self-experimentation projects. He outlines dilemmas such as: should everybody have the right to upgrade to a cyborg? What about the clash between free will and computer control of thoughts? (Warwick, 2003, pp. 135-136). But he does not establish a framework for dealing with these issues. Rather than speculating further on the basis of empirical cases and future scenarios, I would like to call for reflection upon the moral philosophical consequences of evolving into species that are no longer characterized by the kind of sameness that has been a fundamental condition for Homo sapiens.
In his famous paper What Is It Like to Be a Bat? (1974), Nagel discusses the mind-body problem and criticizes physical theories of the mind for their reductionist approach to explaining our conscious experiences. He points to the fact that we are unable to consider the subjective character of experience without trying to imagine what it would be like to be a given experiential subject (Nagel, 1974, p. 166). Nagel then stipulates an objective phenomenology with the purpose of describing the subjective character of experience in a form comprehensible to beings incapable of having those experiences (Nagel, 1974, p. 166). To explain to a blind person what it is like to see, we should first strive to develop a method allowing us to express the structural features of perception in objective terms, well aware that something would still be left out. Thus, to deal with the relation of mind to brain within the framework of a physical theory, we have to consider the problem of subjective versus objective experience.
I agree with Nagel that no reductionist analysis of mental states is fully able to explain the subjective character of experience – the fact that for an organism to have conscious experiences there must be something it is like to be that organism (Nagel, 1974, p. 160). On the other hand, I disagree with Nagel on the idea of reaching an objective understanding of the mental through a structural feature analysis, which would presumably allow us to grasp the explanation of mental experiences with greater precision. In the context of this paper, this disagreement leads me to explore how a lack of sameness, with regard to embodiment (Lakoff & Johnson, 1999) and vulnerability (MacIntyre, 2002), may challenge the development of future social interaction.
Lakoff, G. & Johnson, M. (1999), Philosophy in the Flesh – the embodied mind and its challenge to western thought. Basic Books, NY.
MacIntyre, A. (2002), Dependent Rational Animals – Why Human Beings Need the Virtues, Carus Publishing Company, Illinois.
Moor, J. H. (2005), Should We Let Computers Get Under Our Skin? In: R. J. Cavalier (ed.), The Impact of the Internet on Our Moral Lives. State University of New York Press, NY, pp. 121-138.
Nagel, T. (1974), What Is It Like to Be a Bat? In: Philosophical Review 83, pp. 435-450.
Pearson, I. (2000), The Future of Human Evolution (www.bt.com)
Rawls, J. (1972), A Theory of Justice. Oxford University Press, Oxford.
Tipler, F. J. (1994), The Physics of Immortality. Modern Cosmology – God and the Resurrection of the Dead, Doubleday.
Warwick, K. (2002), I, Cyborg, Century.
Warwick, K. (2003), Cyborg morals, cyborg values, cyborg ethics. In: Ethics and Information Technology, Vol. 5, No. 3, pp. 131-137.