Policies for a ‘nanosociety’: Can we learn now from our future mistakes?

David S. Horner


Controversies over the future of nanoscience and nanotechnology indicate the need for ethically driven policies. The paper will consider whether or not James Moor’s just-consequentialist theory (Moor 1999) provides a practical and theoretically sound response to such a need. It will take a sceptical view of any attempt to justify policies on the strength of forecasting consequences. It will argue that a fallibilist approach is sympathetic to the reconceptualisation of ‘political ecology’ suggested by recent work in the social studies of science and emphasise the need for the democratisation of technological decision making.

The paper will review recent reports from the Royal Society, the ESRC and Greenpeace, for example, which explore a wide range of views on a future ‘nanosociety’. These reveal a wide disparity in assessments of a future shaped by nanotechnology, together with a recognition that policy should be shaped through public debate so as to avert unintended and undesirable consequences. An ESRC report (Wood, Jones and Geldart, 2003) on the social and economic challenges of nanotechnology identifies a continuum of evaluations of the impacts of nanotechnology, from incremental (the continuation of existing research and development directions), through evolutionary (the scaling down of existing technologies towards the nanoscale), to very radical implications (the fully functional nanoscale machines envisaged by Drexler in the 1980s). A Royal Society report indicates that the social and ethical issues raised by nanotechnology include, amongst others, major economic impacts; the opening up of a ‘nanodivide’ intensifying the gap between rich and poor countries; information collection and the implications for civil liberties; the ethical implications of human ‘enhancements’; and the potential for significant military applications. The report argues that even given the wide range of assessments ‘…what does seem clear is that genuinely new and/or unanticipated social or ethical issues are likely to be associated with radical disjunctions if they occur’ (The Royal Society, 2004, p.51). Similarly, Greenpeace emphasises the characterisation of nanotechnology as a ‘disruptive technology’ and ‘…the need for wider participation in the control and direction of technological innovation’ (Arnall, 2003, p.4). What emerges, then, is a need to address the ‘policy vacuums’ created by the emergence of such ‘radical’, ‘disjunctive’ and ‘disruptive’ science and technology.

Nanotechnology is a ‘malleable’ technology, in Moor’s sense, in that it may be used in novel and unexpected ways which escape our current methods of regulation or policies for controlling its use. The paper will evaluate Moor’s framework for tackling ethical decision making about such technologies. For Moor, ‘…a basic job of computer ethics is to identify … policy needs, clarify related conceptual confusions, formulate appropriate new policies, and ethically justify them’ (Moor, 1999, p.65). His framework seeks to combine ‘…considerations of consequences of action with more traditional deontological considerations of duties, rights, and justice’ to ‘…provide us a defensible ethical theory that yields a useful framework of applied ethics’ (Tavani, 2004, p.59). Moor’s approach first applies an ‘impartiality test’ and then appraises the outcomes and consequences of those actions and policies that survive the test, weighing their good consequences against their bad.

The paper will argue that both the future prospects indicated by the reports discussed above and Moor’s just consequentialism are premised on what Bruno Latour (2004) has called the “old” modern Constitution. This he opposes to the “new” Constitution of political ecology, a conception which shifts from ‘matters of fact’ to ‘matters of concern’, where we move ‘…from certainty about the production of risk-free objects (with their clear separation between things and people) to uncertainty about the relations whose unintended consequences threaten to disrupt all orderings, all plans, all impacts… An infinitesimal cause can have vast effects; an insignificant actor becomes central; an immense cataclysm disappears as if by magic; a miracle product turns out to have nefarious consequences; a monstrous being is tamed without difficulty’ (Latour, 2004, p.25). The problem for Moor’s theory is that it must provide, at one and the same time, both ethical and epistemological justifications for policy. But our moral judgements are fallible in much the same way as our empirical judgements: there may be morally relevant facts which will falsify our decisions. As Wood, Jones and Geldart (2003, p.51) suggest, ‘forecasting people’s needs and values 20 years or more into the future is fraught with uncertainty’. Indeed it is! The paper, therefore, will argue for a fallibilist approach to the ‘policy vacuums’ arising from nanoscience and nanotechnology. However, this moral-fallibilist approach will be situated in the context of Latour’s (2004) “new” Constitution of political ecology.


ARNALL, A.H. (2003). Future technologies, today’s choices: nanotechnology, artificial intelligence and robotics; a technical, political and institutional map of emerging technologies. London: Greenpeace Environmental Trust.

LATOUR, B. (2004). Politics of nature: how to bring the sciences into democracy. Cambridge, Mass.: Harvard University Press.

MOOR, J.H. (1999). Just consequentialism and computing. Ethics and Information Technology, 1(1), pp. 65–69.

TAVANI, H. (2004). Ethics and technology: ethical issues in an age of information and communication technology. Hoboken, NJ: Wiley.

THE ROYAL SOCIETY AND THE ROYAL ACADEMY OF ENGINEERING (2004). Nanoscience and nanotechnologies: opportunities and uncertainties. London: The Royal Society and The Royal Academy of Engineering.

WOOD, S., JONES, R. and GELDART, A. (2003). The social and economic challenges of nanotechnology. Swindon: Economic and Social Research Council.