Regulating CCTV

AUTHOR
Andrew A. Adams

ABSTRACT

As is well known, the UK has more CCTV cameras per head than any other country in the world. The majority of the cameras in operation are analogue cameras attached to a nearby visual display, although a significant minority are networked to larger local monitoring centres. A small number of wide networks with broad access exist; one of these is the national traffic flow system. In his 2005-6 Annual Report the UK’s Chief Surveillance Commissioner raised the question of ANPR (Automatic Number Plate Recognition) systems being attached to this network and used to routinely track the movement of vehicles. He questioned whether this was consistent with human rights legislation, and also raised concerns about the security of the data being gathered.

As cheap digital cameras are rolled out to augment or replace existing analogue installations, as network access to camera installations becomes widespread, and as automated processing gains capabilities, how should the deployment and use of such systems be regulated? Until 2005 all “identifiable” images of a particular individual were regarded under UK law as “personal data” as defined by the Data Protection Act 1998. Following the case of Durant v FSA, the advice on when a video sequence constitutes “personal data” changed radically. As the capabilities for tracking individuals’ movements through widespread camera networks increase, the question of how to regulate the use, misuse and abuse of CCTV installations becomes ever more pressing.

Is a CCTV Act needed, and if so, what should its provisions be? In this paper we propose appropriate regulatory apparatus and principles to govern the deployment of CCTV systems and the use of automated processing systems attached to networks of cameras. Issues addressed include:

Defining the nature of video sequences as “Personal Data”. At what resolution and orientation do individuals become recognisable within the field of surveillance? Is it possible to track movement between camera viewpoints? If PTZ (Pan/Tilt/Zoom) cameras are in use, how are their viewpoints controlled?
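Purely as an illustration of the resolution question, the following sketch estimates how many pixels a face would occupy in a frame under a simple pinhole-camera model; it is not part of the proposed regulation, and the sensor width, field of view and distance used are hypothetical values rather than figures from any deployed system.

import math

def pixels_on_target(sensor_px_width, horizontal_fov_deg, target_width_m, distance_m):
    """Estimate how many horizontal pixels a target of a given physical width
    occupies at a given distance, under a simple pinhole-camera model."""
    # Width of the scene covered by the camera at that distance.
    scene_width_m = 2 * distance_m * math.tan(math.radians(horizontal_fov_deg) / 2)
    # Fraction of the scene the target occupies, scaled to sensor pixels.
    return sensor_px_width * target_width_m / scene_width_m

# Hypothetical example: a 704-pixel-wide analogue-resolution feed with a 60 degree
# lens, viewing a face roughly 0.16 m wide from 10 m away.
print(round(pixels_on_target(704, 60.0, 0.16, 10.0), 1))  # roughly 10 pixels wide

At roughly ten pixels across, such a face is unlikely to be identifiable; thresholds of this kind are one way a definition of video “personal data” might be framed.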

Acknowledging the link between “identity”, “identification” and “surveillance”. Identification can be internal to a sequence or surveillance area (the figure seen here is the same as the figure seen there) or tied to external identifiers, including both databases and human identification.
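To make that distinction concrete, a minimal sketch of how the two kinds of identification might be represented in software follows; the class and field names are invented for illustration and do not describe any existing system.

from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ObservedFigure:
    """A figure tracked within a surveillance area. Identification may remain
    internal (the same track re-identified by several cameras) or be escalated
    by linking the track to an external identifier such as a database record."""
    track_id: int                       # internal: "this figure is that figure"
    camera_ids: List[str] = field(default_factory=list)
    external_id: Optional[str] = None   # e.g. a database key or an ANPR plate

# Internal identification only: one figure seen by two cameras.
figure = ObservedFigure(track_id=42, camera_ids=["cam-03", "cam-07"])

# External identification: the step that ties surveillance to identity.
figure.external_id = "vehicle:AB51XYZ"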

Setting limits on who may deploy video surveillance equipment in public and semi-public spaces. Surveillance activity should be “notifiable” along the same lines as the processing of personal data. Systems beyond one or two cameras in a corner shop should be notified to the Surveillance Commissioners, and appropriate principles of surveillance should be published and enforced, similar to the Data Protection Principles.

Setting statutory requirements for notification of CCTV monitoring, recording and automated processing of images to those thus surveilled. Rights to prevent abuse of surveillance depend on awareness of that surveillance. The routine recording and processing of video data, and its possibly insecure transmission and storage, are all issues which require public acknowledgement and scrutiny if we are to avoid widespread abuse that cannot easily be remedied “after the fact”.

Setting “necessity and proportionality” principles in law for the routine automated processing of video surveillance. The attachment of ANPR to traffic flow monitoring was undertaken in a systematic manner by UK police forces without any public consultation and without higher-level social or political authorisation. How much recording and processing of data is necessary and proportionate to the benefits society gains from such surveillance, and who benefits and who loses in the process?

Requiring appropriate security for the primary video feed and stored sequence data to prevent unauthorised processing. Fiction presents CCTV networks as very easy to crack into with a modicum of expertise. As automated processing becomes cheaper and more sophisticated, the “value” of video feeds becomes greater and thus the security of such systems requires more attention.
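As one illustration of what appropriate security for stored or transmitted sequences could look like, the sketch below encrypts captured data with a symmetric key using the third-party Python cryptography library; the key handling and the payload shown are simplified assumptions for illustration, not a description of any fielded system.

from cryptography.fernet import Fernet  # third-party "cryptography" package

def encrypt_frame(frame_bytes, key):
    """Encrypt a captured frame (or a whole recording) before storage or
    transmission, so that only holders of the key can view it."""
    return Fernet(key).encrypt(frame_bytes)

def decrypt_frame(token, key):
    """Decrypt a frame for authorised playback; raises if it has been tampered with."""
    return Fernet(key).decrypt(token)

# The key would in practice be generated and held by the data controller,
# separately from the recordings; the payload here is purely illustrative.
key = Fernet.generate_key()
token = encrypt_frame(b"raw frame bytes would go here", key)
assert decrypt_frame(token, key) == b"raw frame bytes would go here"

Encrypting at or near the point of capture, with keys held by the data controller rather than stored alongside the recordings, would address both the transmission and storage risks noted above.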

Defining “best practice” in the training and supervision of those with access to sequences. As shown by the Sefton CCTV operators case and by ethnographic studies by Norris and Gould (among others), misuse of public CCTV cameras for voyeuristic and other purposes does happen. The Data Protection Act requires Data Controllers to ensure that those with access to data are properly trained, not only in how to access such data but also in the limits on whether they should access particular information. Similar measures should apply to CCTV surveillance operators.

Defining “best practice” in the deployment of PETs (Privacy Enhancing Technologies). PETs exist and are being improved alongside other forms of automated CCTV processing. For example, it is possible to block certain areas of a camera’s view from the video feed entirely, or to encrypt certain portions of the feed so that only authorised access is possible. Alongside necessity and proportionality rules for deploying surveillance capabilities, appropriate use of such PETs should be mandated.
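A minimal sketch of the first kind of PET mentioned above, masking defined regions of a frame before it leaves the camera unit, follows; it assumes frames are handled as NumPy arrays, and the frame size and region coordinates are hypothetical.

import numpy as np

def mask_private_regions(frame, regions):
    """Blank out rectangular regions (x, y, width, height) of a frame so that
    private areas, such as residential windows, never leave the camera unit."""
    masked = frame.copy()
    for x, y, w, h in regions:
        masked[y:y + h, x:x + w] = 0  # overwrite the pixels with black
    return masked

# Hypothetical 480x640 greyscale frame with one region declared private.
frame = np.random.randint(0, 256, size=(480, 640), dtype=np.uint8)
protected = mask_private_regions(frame, [(100, 50, 200, 150)])
assert (protected[50:200, 100:300] == 0).all()

Applying the mask at, or as close as possible to, the camera ensures that the excluded pixels are never recorded or transmitted, which is the property any mandate for such PETs would need to specify.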