AUTHOR
Iordanis Kavathatzopoulos
ABSTRACT
Information technology has many advantages that can be used for the promotion of ethical competence. It saves time and space, it has enormous memory storage capacity, and it can process and reorganize information quickly and reliably. Recent technical developments in particular, which make it possible to construct advanced games and to simulate the complexity of reality in microworlds, may further broaden the spectrum of opportunities for supporting ethical problem solving and decision making. In this paper we present our efforts to construct an ethical problem-solving support system and an ethical microworld.
There are, however, certain important issues to consider before building such systems. Confounding moral values with psychological processes can create many problems and sometimes makes effective support impossible (Blasi, 1980; Greene et al., 2004; Haidt, 2001; Jackson, 1994; Jaffee & Hyde, 2000). Accordingly, our theoretical basis is that successful information technology tools in ethics are those that are adapted exclusively to psychological problem-solving and decision-making processes.
When we plan to use information technology tools to support ethical decision making, we usually run the risk of disregarding the psychological skill aspects of ethical competence. The classical approach normally focuses on informing users about moral philosophy, presenting lists of principles and stakeholder interests, or simply producing moral solutions based on predefined normative values (Collins & Miller, 1992; Gotterbarn & Rogerson, 2002; Pfeiffer, 1999). Creating and using information technology tools based primarily on this classical approach certainly has its strengths, but it also has many weaknesses (Winograd, 1995; Friedman, 2005).
Ethical competence can instead be defined as based on the psychological ability described as autonomy. However, this skill is not easy to use in real situations. Psychological research has shown that plenty of time and certain conditions are required before people can acquire and use the ethical ability of autonomy (Piaget, 1932; Kohlberg, 1985; Schwartz, 2000; Sunstein, 2005). When people face a moral problem they have great difficulty not confusing moral goals, values, feelings and emotions with the decision-making and problem-solving processes and the methods adopted for solving the problem. Usually they neither see the context of the problem clearly nor analyze it in the way they often do with problems of the natural world. In psychological theory this is described as the moral phase of heteronomy, which, in contrast to autonomy, means that the individual does not use functional problem-solving strategies, that is, critical thinking. Autonomous and critical moral thinking is difficult, more difficult than autonomous technical thinking. In seeking to promote ethical competence, we therefore need to make sure that the support tools we use do indeed stimulate autonomous ethical thinking. Information technology is well suited to supporting the acquisition and use of ethical autonomy because of its special qualities and possibilities.
The use of real-life simulations by decision makers may help them learn more easily how to handle morally complex and controversial situations satisfactorily. One way to do this is to connect the progress of the simulation to the concrete way users treat moral problems rather than to general normative aspects of the solutions they produce. For example, this can be done by incorporating in the simulation the interests, values, feelings, etc., of stakeholders whose reactions may influence the development of the simulation process.
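As an illustration of this idea, the following sketch shows, in Python, one way such a process-sensitive progression could be arranged. All names and numeric values here (Stakeholder, MicroworldScenario, the reaction parameters) are hypothetical illustrations and are not taken from the actual system; the point is only that the scenario advances according to which stakeholder interests the user considered, not according to the normative content of the chosen solution.

from dataclasses import dataclass

@dataclass
class Stakeholder:
    # A party in the microworld whose interests the user may or may not consider.
    name: str
    interests: set
    satisfaction: float = 0.5   # reaction state that drives the scenario forward

@dataclass
class MicroworldScenario:
    stakeholders: list
    tension: float = 0.0        # rises when interests are ignored, producing new dilemmas

    def register_decision(self, considered_interests: set) -> None:
        # Advance the scenario according to HOW the decision was made: which
        # stakeholder interests the user explicitly considered, not whether the
        # decision matches any predefined "correct" moral answer.
        for s in self.stakeholders:
            if s.interests & considered_interests:
                s.satisfaction = min(1.0, s.satisfaction + 0.1)
            else:
                # An overlooked stakeholder reacts, raising tension and
                # pushing the simulation toward a new, more complex dilemma.
                s.satisfaction = max(0.0, s.satisfaction - 0.2)
                self.tension += 0.2

    def next_dilemma(self) -> str:
        # A follow-up dilemma driven by the stakeholders' reactions.
        unhappy = [s.name for s in self.stakeholders if s.satisfaction < 0.4]
        if unhappy:
            return ", ".join(unhappy) + " protest the decision; how do you respond?"
        return "The situation is stable for now; a new routine decision arises."

In such a design the feedback the user receives is generated by the stakeholders' reactions rather than by a hidden moral scoring rule, which is what ties the simulation to the user's handling of the problem instead of to a predefined solution.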
Furthermore, information technology tools have great advantages according to the hypothesis of autonomy. Their memory storage capacity is enormous, and they are excellent at systematic work and data analysis. Just by using them as a database or an expert system in the effort to solve a concrete moral problem, the user can obtain information about relevant values and interests, as well as about alternative ways of action, that might otherwise be overlooked. Being reminded of the diversity, variety and complexity of the actual moral problem could effectively block decision makers' natural tendency toward heteronomy and stimulate autonomy.
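A minimal sketch of this database role, again in Python, could look as follows. The domains, stakeholders, values and alternatives listed are invented examples and not content from the actual tool; the essential design choice is that the system only reminds and never ranks or recommends.

# Illustrative reminder database: each problem domain maps to stakeholders,
# values and alternative actions that a decision maker might otherwise overlook.
REMINDERS = {
    "workplace monitoring": {
        "stakeholders": ["employees", "employer", "customers", "IT staff"],
        "values": ["privacy", "trust", "security", "transparency"],
        "alternatives": ["anonymized logging", "opt-in monitoring",
                         "monitoring only on concrete suspicion"],
    },
    # further domains would be added here
}

def remind(domain: str) -> None:
    # Print the stored interests, values and alternatives for a problem domain.
    # The weighing of these items is left entirely to the user (autonomy, not heteronomy).
    entry = REMINDERS.get(domain)
    if entry is None:
        print(f"No stored material for '{domain}'. Please add it yourself.")
        return
    for category, items in entry.items():
        print(f"{category}:")
        for item in items:
            print(f"  - {item}")

remind("workplace monitoring")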
The paper presents the structure and function of an ethical microworld simulation and of a support system for ethical problem solving and decision making. The ethical microworld simulation models realistic scenarios with interacting, independent stakeholders. Users of the simulation are prompted to make autonomous decisions in the dilemmas that arise from the interaction between stakeholders. The goals are to investigate possible ways of implementing the psychological approach to ethical problem solving and decision making, and to stimulate higher ethical competence.
The ethical support system is based on the theory of autonomy. When it is used, thinking is guided away from heteronomy and toward autonomy. Its basic features are that it: 1) does not allow the user to treat the system as a moral authority; 2) does not present a ready-made set of moral principles and values; 3) helps the user to be unconstrained by moral fixations and authorities; 4) helps the user to organize and analyze the facts; 5) helps the user to weigh the relevant values and principles against each other; 6) helps the user to solve the moral problem at hand systematically; and 7) forces the user to justify his or her decisions with regard to the relevant interests and values.
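These seven features could, for instance, be realized as a question-driven dialogue rather than an advisory engine. The following simplified Python sketch is a hypothetical rendering of such a flow (the function name and prompts are invented, not taken from the actual system); note that it never produces a recommendation but only structures the user's own analysis and records the user's justification.

def run_support_session() -> dict:
    # A non-prescriptive session: the system structures the work, the user decides.
    record = {}

    # Features 1-2: the system never acts as a moral authority and offers no
    # ready-made principles, so there is no recommendation step at all.

    # Feature 4: help the user organize and analyze the facts.
    record["facts"] = input("List the relevant facts of the situation: ")
    record["stakeholders"] = input("Who is affected, and what are their interests? ")

    # Feature 5: help the user weigh relevant values against each other.
    record["values"] = input("Which values and principles are in conflict here? ")
    record["weighing"] = input("How do you weigh these values against each other? ")

    # Features 3 and 6: counteract fixations and push for a systematic solution.
    record["alternatives"] = input("List at least two alternative courses of action: ")
    record["consequences"] = input("What are the consequences of each alternative "
                                   "for each stakeholder? ")

    # Feature 7: force the user to justify the decision in terms of the
    # interests and values identified above.
    record["decision"] = input("Which alternative do you choose? ")
    record["justification"] = input("Justify your choice with explicit reference "
                                    "to the interests and values above: ")
    return record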
REFERENCES
Blasi, A. (1980). Bridging moral cognition and moral action: A critical review of the literature. Psychological Bulletin, 88, 1-45.
Collins, W. R. & Miller, K. (1992). A paramedic method for computing professionals. Journal of Systems and Software, 13, 47-84.
Friedman, B. (2005). Value sensitive design and information systems. Available: http://www.ischool.washington.edu/vsd/vsd-and-information-systems.pdf
Gotterbarn, D. & Rogerson, S. (2002). Project Planning Software [Computer software], East Tennessee State University.
Greene, J. D., Nystrom, L. E., Engell, A. D., Darley, J. M., & Cohen, J. D. (2004). The neural bases of cognitive conflict and control in moral judgment. Neuron, 44, 389-400.
Haidt, J. (2001). The emotional dog and its rational tail: A social intuitionist approach to moral judgment. Psychological Review, 108, 814-834.
Jackson, J. (1994). Coping with scepticism: About the philosopher’s role in teaching ethical business. Business Ethics: A European Review, 3, 171-173.
Jaffee, S. & Hyde, J. S. (2000). Gender differences in moral orientation: A meta-analysis. Psychological Bulletin, 126, 703-726.
Kohlberg, L. (1985). The just community approach to moral education in theory and practice. In M. Berkowitz & F. Oser (Eds.), Moral education: Theory and application (pp. 27-87). Hillsdale, NJ: Lawrence Erlbaum Associates.
Pfeiffer, R. S. (1999). Ethics on the job: Cases and strategies. Belmont, CA: Wadsworth.
Piaget, J. (1932). The moral judgment of the child. London: Routledge & Kegan Paul.
Schwartz, B. (2000). Self-determination: The tyranny of freedom. American Psychologist, 55, 79-88.
Sunstein, C. R. (2005). Moral heuristics. Behavioral and Brain Sciences, 28, 531-573.
Winograd, T. (1995). Computers, ethics and social responsibility. In D. G. Johnson & H. Nissenbaum (Eds.), Computers, ethics and social values (pp. 25-39). Upper Saddle River, NJ: Prentice Hall.