AUTHOR
Richard S. Rosenberg
ABSTRACT
Much of the motivation for filtering and blocking programs arises from the efforts in the U. S. to defeat the Communications Decency Act of 1996 (CDA) by showing that programs exist, or would soon exist, to control access at the local level, removing the need to place the burden on Internet Service Providers (ISPs). In some sense, this was a bargain made with the devil because those opposed to the CDA expected that filtering programs would largely be used in the privacy of one’s home, not in public institutions such as libraries, schools, and community centres. This latter use imposes restrictions on the general public (library patrons) that do not apply to families that choose to purchase and use such programs in their homes. In public places, such programs violate individual choice by substituting software whose criteria of access are largely a mystery and are subject to a number of pressure groups with their own agendas.
If filtering programs are to be employed – and the current discussion in the U. S. Congress strongly suggests that their use will be made mandatory wherever federal funds are allocated to pay for Internet connections – then at the very least, the filtering criteria – keywords, local or remote lists – must be accessible to library patrons; otherwise, the process is simply a form of censorship. Since libraries are the only source of Internet access for many people, they should not limit that access by arbitrary means. Although the debate is informed by both legal and ethical positions, the focus of this paper will be on the ethical side. That is, librarians, as professionals, must make decisions with respect to access policies, limited budgets, collections policies, and the rights of the borrowers, to say nothing of community standards in general.
As professionals dedicated to upholding open inquiry and the right to access available information sources, librarians face a number of difficult challenges with respect to the use of filtering programs. It is not surprising therefore that the professional associations in Canada and the US have adopted very strong stances against the use of filters. Nevertheless, in some libraries, community pressure has resulted in a variety of strategies to “protect children” while leaving adults free to pursue their interests. For example, one or more computers in the children’s section may have a filtering program installed but none of the ones in the adult areas are similarly compromised. Computers in the adult areas may be shielded from casual view by surrounding them with portable walls or that dreaded tap on the shoulder may be employed to remind patrons that they should respect the sensibilities of others. All of these strategies are hotly debated in the library community.
But what of the growing number of filtering programs and blocking programs based on criteria that may not be readily available or even understandable? Key words, lists of blocked sites, and levels or degrees of violence or sex used to characterize Web sites all present problems to professionals who are concerned with helping people answer their questions. And why should librarians be responsible for children whose parents have dropped them off in the supposedly safe confines of a library while they go about their shopping? All of these issues are distinct from a number of legal and political ones, such as possible government requirements that, to obtain federal funds for accessing the Internet, schools, libraries, and community centres may have to install filtering programs, or that local politicians may initiate similar requirements to win the favour of their conservative constituents. I will argue that filters are no substitute for responsible parents and that if conservatives wish to keep government out of their lives, they should assume a more involved role in the Internet activities of their children.
In what follows, we will describe filtering programs and some examples of known problems associated with their use, both in Canada and the U. S. Blocking programs based on self-rating mechanisms present other, somewhat more subtle difficulties, and concern with their use, and with the seductive notion of self-rating in general, will be discussed as well. The current debate will be characterized and criticized, with examples drawn from both the US and Canada. Wider ramifications will also be examined, as the European Community is also considering the use of filters to tame the world of the Internet. Finally, some conclusions and recommendations will be offered in the light of the arguments presented.