Unintended Consequences: Computerising the UK’s Social Fund

AUTHOR

Andy Bissett,
School of Computing & Management Sciences,
Sheffield Hallam University,
England
S1 1WB

ABSTRACT

The UK Government has placed great emphasis upon streamlining government processes and making government more open through the use of information technology. Indeed, the UK Prime Minister has stated that all 457 individual government services are to be delivered electronically by 2005, two years behind a similar commitment made by the US government (Cross, 2000). No doubt behind the rhetoric of modernisation and transparency there also lies a more traditional concern to save money. However, this far-reaching strategy is having effects beyond its explicit and implicit aims. Beyond the ‘simple’ technical causes of project failure (Hinde, 2000), it is possible to identify emergent effects that were unintended. The case of the UK’s Social Fund is presented here in detail, because the ethical consequences of the project to computerise its assessment procedures are immediate.

The UK’s Social Fund is administered by the Government’s Benefits Agency. Its purpose is to provide discretionary, interest-free loans of up to £1,000 to the poorest families to assist in the purchase of necessities such as shoes, clothing, cookers and beds. Before the assessment system was computerised in April 1999, approximately 11,000 applications per year were rejected by human assessors. Following computerisation, the number of rejections has soared to 362,000 (Hartley-Brewer, 2000). Disabled people and those with special needs appear to have been especially affected by the new automated system (CAB, 2000).

The annual government report on the Social Fund notes that the trend to reject applications was already rising before computerisation (BA, 2000). However, the enormous increase appears to be an unintended consequence of automating an assessment system that had previously allowed human discretion some room for manoeuvre. A Government spokesperson defended the new system, saying that it had helped to end “intrusive and paternalistic” questioning by Benefits Agency staff (Hartley-Brewer, 2000). The Government’s position is that total loans to families in poverty rose by 15%, from £344m in 1998-99 to £396m in 1999-2000. The same statement claimed that the number of rejections had risen in line with the number of applications (Hartley-Brewer, 2000).

In this work we trace the recent history of the Social Fund and of the steps taken to automate the assessment procedure. We follow the dialectical interplay between this initiative and the various stakeholder reactions. An attempt is made to disentangle the figures and trends from political presentation and ‘spin’, and to explain what has been happening.

We hypothesise that the case of the UK’s Social Fund primarily represents an instance of technological determinism (Davies, 1996; Davies, 1997). The somewhat predictable phenomenon of technological determinism can give rise to locally unpredictable consequences. We draw out the ethical implications of technological determinism and offer some advice and guidelines for avoiding unintended consequences in future. We propose that, by analogy with safety-critical computer systems, an ‘ethical audit’ for such ‘welfare-critical’ systems should be a standard part of the case made for them at inception. In particular, ethical dimensions and effects must be considered at project inception, parameterised and baselined, in order to avoid harmful unintended consequences. This is consistent with, and builds upon, the concept of ethical issues in IT systems being considered at design time (Feng, 1998).

As a secondary dimension, we investigate the difficulty that organisations such as governments often have in recognising and learning from such errors: ‘… designed error and its cover-up are the foundations for producing unethical behaviours in ways that seem reasonable, if not necessary’ (Argyris, 1990, p. xiii). Using Argyris’ concept of the learning organisation, we attempt to generalise from the case of the UK Social Fund so that more widely applicable ethical lessons may be learned.

REFERENCES

  • Argyris, C. (1990) Overcoming Organisational Defenses, Englewood Cliffs NJ: Prentice Hall.
  • BA (2000) Benefits Agency Annual Report and Account for the Social Fund 1999-2000, House of Commons Paper HC 618, London: The Stationery Office.
  • CAB (2000) Computerising the Social Fund, Citizens’ Advice Bureau, London, England.
  • Cross, M. (2000) Lives online, The Guardian (Society Section), 15th November 2000, 2-3.
  • Davies, P. (1996) Technology: the missing factor in understanding the relationship between culture and business ethics theory, in Barroso, P., Ward Bynum, T., Rogerson, S. and Joyanes, L. (eds.) Proceedings ETHICOMP 96, Vol. 1, Complutense University of Madrid, 122-140.
  • Davies, P. (1997) Technology and business ethics theory, Business Ethics: A European Review, 6 (2), 76-80.
  • Feng, P. (1998) Rethinking technology, revitalising ethics: overcoming barriers to ethical design, in van den Hoven, J., Rogerson, S., Ward Bynum, T. and Gotterbarn, D. (eds.) Proceedings ETHICOMP 98, Erasmus University Rotterdam, March 1998, 97-108.
  • Hartley-Brewer, J. (2000) Benefit loans refusals rocket to 362,000, The Guardian, 15th August 2000, 10.
  • Hinde, S. (2000) New millennium, old failures, Computers and Security, 19 (2), 119-127.