Harold Thimbleby and Penny Duquenoy
Bounds Green Road,
Justice is about doing good for other people. It is clear that computers could be better – this paper therefore makes a start by asking what “just programming” means, and how aiming for justice might change how we program.
Computers have transformed society, are still transforming society, and will continue to do so for the foreseeable future. The vast quantities of information easily accessible to all of us, the easy ways of creating and disseminating new information – such as using email, word processors, spreadsheets and the mobile web – are all transforming work practices and life generally. Computer games to international financial systems increasingly define our culture.
Yet everywhere we turn we find malfunctioning computer programs. PC applications are notorious for crashing. Frequently the human-computer system crashes as a pair, leaving users at a loss as to what to do next. Industry wastes millions of pounds in time lost when people need advice and help on how to continue working with their unreliable and incomprehensible computers. The Year 2000 Bug, as it happened, turned out not to be the wholesale catastrophe the doomsayers predicted, but the hype played on well-founded fears of society’s dependence on such a fragile technology. More money was spent fixing the bug than on any other single problem ever to face humanity.
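The failure mode behind the Year 2000 Bug is easy to sketch. The following is a minimal, hypothetical illustration – the function and the two-digit encoding are invented for exposition, not taken from any real system – of how storing years as two digits makes 2000 (“00”) appear to come before 1999 (“99”):

```python
# Hypothetical sketch of the two-digit-year logic behind the Year 2000 Bug.
# Many legacy systems stored only the last two digits of the year ("YY"),
# so arithmetic that worked throughout the 1900s broke at the century boundary.

def years_between(start_yy, end_yy):
    """Naive elapsed-years calculation using two-digit years."""
    return end_yy - start_yy

# A policy opened in 1995 and checked in 1999: correct answer.
print(years_between(95, 99))   # 4

# The same policy checked in 2000 (stored as "00"): a nonsense negative age.
print(years_between(95, 0))    # -95
```

A tiny coding decision – saving two bytes per date – thus became a worldwide repair bill decades later.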
Why does society put up with the unreliability of computers, especially when they are presented as essential solutions to almost every problem? Why do we discount the cost of Y2K fixes? Why are books called “Computers for Dummies” and “Idiot’s Guides” so popular? Why do we buy software that comes with no warranty?
Programming is an activity whose purpose is to affect other people, which it does through programmed devices. This paper brings together a range of issues around programming and argues that it is both a political and an ethical activity, and that it has been studied as such (though under the banner of human-computer interaction, and so on). We know from the regular failures of computers – whether the besetting daily problems of desktop systems or the embarrassing failures of megaprojects – that computers are not automatically beneficial in every way. Worthwhile ends have to be achieved despite the risk of woe and catastrophe; understanding how to build better computers is an ethical obligation. Programming computers so that they more often have a beneficial impact is not easy, and is in any case not always a conscious goal of development. Programming will not get better automatically. Changing the organisations and processes within which programming takes place must also be encouraged.
Focussing on identified areas of HCI (Human Computer Interaction), such as usability, in terms of applied ethics can provide helpful insights. Usability is a useful way of being precise about what the key problems are; there are even ISO standards for it (such as ISO 9241). With this more precise focus it is clear, for instance, that users buying idiot’s books is a symptom, not part of the problem, and that measuring user performance contributes to making improvements. The fields of usability divide fairly cleanly according to their ethical signatures: this is not surprising, since usability is about making computers better, and ethics is about what “better” itself means.
There is an even stronger connection between programming and ethics. We write a program and other people use it. What they can or cannot do is determined by our programs; what their clients can and cannot do is determined by our program’s functions and features. In short, there is a direct step from program code to social codes. Sometimes the connection is straightforward: a database for airline tickets “knows” all the costs, but is programmed so that web customers cannot find the cheapest flights. Sometimes the connection is most apparent in criminal activity: when hackers insert code to benefit themselves or to get their own back on organisations.
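The step from program code to social codes can be made concrete. The sketch below is entirely hypothetical – the data, names and channels are invented – but it shows how a commercial policy becomes “hard coded”: the database holds every fare, yet the web search path silently filters out the cheapest fare classes, so web customers can never find them.

```python
# Hypothetical illustration: a fare database that "knows" all the prices,
# with a search function that hides the cheapest fares from web customers.

FARES = [
    {"flight": "XY123", "fare_class": "wholesale", "price": 59},
    {"flight": "XY123", "fare_class": "web",       "price": 89},
    {"flight": "XY123", "fare_class": "flexible",  "price": 240},
]

def search_fares(channel):
    """Return the fares visible to a sales channel, cheapest first.

    Only "agent" channels see wholesale fares; this single line of code
    is where the social policy lives.
    """
    visible = [f for f in FARES
               if channel == "agent" or f["fare_class"] != "wholesale"]
    return sorted(visible, key=lambda f: f["price"])

print(search_fares("web")[0]["price"])    # 89 – the cheapest a web customer ever sees
print(search_fares("agent")[0]["price"])  # 59 – the fare the database really knows
```

Nothing in the user interface reveals that the filter exists; the decision about who may see what is encoded in one conditional expression.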
Programming a system to be usable is about making the system good for the people who use it. Aristotle (1990) defined justice as doing good for other people, so promoting usability is promoting justice.
There are numerous big issues – from pornography and privacy to nuclear power station safety – which clearly raise ethical questions, of world wide (web) scale, and which, furthermore, are created by computers, and hence by our programming decisions. These issues must be welcomed as ethical consciousness-raising, and moreover as ones in which computer programmers have a direct and central influence. It is, of course, possible to program ignoring ethics, but this does not make the issues go away: it means, rather, that poor decisions will be “hard coded” regardless of their impact. This in turn will encourage bad social practices and then bad laws (consider government influence over e- and m-commerce).
The idea that there is simply a right or a wrong way of programming is unsophisticated. Programmers must appreciate the importance of the issues, and the ways in which justice can be woven into everyday practice, even down to coding styles. It is motivating (if not scary) that the leverage programming gives our efforts is enormous: even applets can be used by thousands of people daily. This should be enough to motivate us to seek that better world in which as-yet unknown users are empowered – which is the point of programming.