Computing & Privacy

A monograph on the right to privacy from the National Conference on Computing and Values (NCCV), held on the campus of Southern Connecticut State University in August 1991.

Editor’s Introduction

The National Conference on Computing and Values (NCCV) was held on the campus of Southern Connecticut State University in August 1991. The Conference included six “tracks”: Teaching Computing and Human Values, Computer Privacy and Confidentiality, Computer Security and Crime, Ownership of Software and Intellectual Property, Equity and Access to Computing Resources, and Policy Issues in the Campus Computing Environment. Each track included a major address, three to five commentaries, some small “working groups,” and a packet of relevant readings (the “Track Pack”). A variety of supplemental “enrichment events” were also included.

This monograph contains the proceedings of the “Computer Privacy and Confidentiality” track of NCCV. It includes one background reading, the “track address” with two commentaries, the conference bibliography, and a report on the activities and findings of the small working group on privacy and confidentiality. In addition, there is a separate section containing two “enrichment papers” on professional codes of ethics. The background reading is “Three ‘Levels’ of Computer Ethics” by Terrell Ward Bynum. The track address is “Contemporary Privacy Issues” by Willis Ware. The commentaries are “Information as a Commodity: Control and Benefit are Morally Owed to the Source” by Richard A. Wright and “Comments on Willis Ware’s ‘Contemporary Privacy Issues’” by Loftus Becker. Jacque Catudal was the “Track Coordinator,” and the Appendix at the end is his report on the working group’s activities and findings.

The enrichment papers form a special section on professional codes of ethics, which covers a broader range of topics than simply privacy. These papers include “A Rationale for the Proposed Revision of the Association for Computing Machinery’s Code of Professional Conduct” by Ronald E. Anderson and “Facing the Computer Ethics Dilemma” by C. Dianne Martin and David H. Martin.

The National Conference on Computing and Values was a major undertaking that required significant help from many people. The Editors would like to express sincere thanks to the National Science Foundation and the Metaphilosophy Foundation for support that made the project possible. And we wish to thank the following people for their invaluable help and support: (in alphabetic order) Denice Botto, William Bowersox, Aline W. Bynum, Robert Corda, Donald Duman, Richard Fabish, James Fullmer, Ken W. Gatzke, Steven J. Gold, Edward Hoffman, Rodney Lane, Sheila Magnotti, Armen Marsoobian, John Mattia, P. Krishna Mohan, Beryl Normand, Robert O’Brien, Daniel Ort, Anthony Pinciaro, Amy Rubin, Brian Russer, Elizabeth L.B. Sabatino, Charlene Senical, J. Philip Smith, Ray Sparks, Larry Tortice, Suzanne Tucker.

Three “Levels” of Computer Ethics

Technology and Human Values

Ideally, new technology always advances, enhances and supports human values. But of course this is not an ideal world. The effects of technology are mixed. For example, the “agricultural revolution” and the “industrial revolution” brought many benefits to human beings: food for the hungry, effective medical care for the sick, relief from heavy labor, rapid and comfortable transportation, and so on. Nevertheless, problems were generated: overpopulation, world-threatening weapons, pollution, terrible accidents which killed many people, etc.

Too often, new technology develops with little attention to its impact upon human values. The mass production of automobiles, for example, had profound effects upon cities, travel, entertainment, nature, the environment, even sexual mores. Many of the consequences were unforeseen – even unimagined – by those who created the technology.

Let us do better! In particular, let us do what we can in this era of “the computer revolution” to see that computer technology advances human values.

True enough, we could argue endlessly over the meanings of terms like “privacy,” “health,” “security,” “fairness,” or “ownership.” Philosophers do it all the time – and ought to. But people understand such values well enough to desire and even to treasure them. We do not need absolute clarity or unattainable unanimity before we do anything to advance them.

What is Computer Ethics?

How can we work to make computing technology advance human values? One way is to teach “computer ethics” to the public at large and to our students enrolled in courses in computing and information sciences. But what is computer ethics?

The term “computer ethics” was coined in the mid 1970s by Walter Maner to refer to that field of applied professional ethics dealing with ethical problems aggravated, transformed or created by computer technology. By analogy with the more developed field of medical ethics, Maner focused attention upon applications of ethical theories and decision procedures used by philosophers doing applied ethics. He distinguished “computer ethics” from the sociology of computing and from technology assessment.

For nearly two decades, the term “computer ethics” kept this focused meaning. Recently, however, it has acquired a broader sense that includes applied ethics, sociology of computing, technology assessment, computer law, and related fields. This broader kind of computer ethics examines the impact of computing and information technology upon human values, using concepts, theories and procedures from philosophy, sociology, law, psychology, and so on. Practitioners of the broader computer ethics – whether they are philosophers, computer scientists, social scientists, public policy makers, or whatever – all have the same goal:

To integrate computing technology and human values in such a way that the technology advances and protects human values, rather than doing damage to them.

Donn Parker pursues this goal by gathering example cases and presenting scenarios for discussion. Judith Perrolle does it by applying sociological theories and tools to data about computing; Sherry Turkle does it by applying psychological theories and tools; James Moor, Deborah Johnson and others do it by applying philosophical theories and tools; and so on.

All of these thinkers and many others address problems about computing technology and human values, seeking to

  1. Understand the impact of computing technology upon human values
  2. Minimize the damage that such technology can do to human values, and
  3. Identify ways to use computer technology to advance human values.

Three “Levels” of Computer Ethics

Computer ethics questions can be raised and studied at various “levels.” And each level is vital to the overall goal of protecting and advancing human values. On the most basic level, computer ethics tries to

sensitize people to the fact that computer technology has social and ethical consequences.


This is the overall goal of what some call “pop” computer ethics. Newspapers, magazines and TV news programs have engaged increasingly in computer ethics of this sort. Every week, there are news stories about computer viruses, or software ownership law suits, or computer-aided bank robbery, or harmful computer malfunctions, or computerized weapons, etc. As the social impact of information technology grows, such articles will proliferate. That’s good! The public at large should be sensitized to the fact that computer technology can threaten human values as well as advance them.

The second “level” of computer ethics can be called “para” computer ethics. Someone who takes a special interest in computer ethics cases, collects examples, clarifies them, looks for similarities and differences, reads related works, attends relevant events, and so on, is learning “para” computer ethics. (I’ve borrowed this term from Keith Miller, who is the first person I ever heard use it.) By analogy with a para medic – who is not a physician, but who does have some technical medical knowledge – a “para” computer ethicist is not a professional ethicist, but does have some relevant special knowledge. A para medic, of course, cannot do all that a physician does, but he or she can make preliminary medical assessments, administer first aid and provide rudimentary medical assistance. Similarly, a “para” computer ethicist does not attempt to apply the tools and procedures of a professional philosopher or lawyer or social scientist. Rather, he or she makes preliminary assessments and identifications of computer ethics cases, compares them with others, suggests possible analyses.

The third level of computer ethics I call “theoretical” computer ethics, because it applies scholarly theories to computer ethics cases and concepts. Someone proficient in “theoretical” computer ethics would be able not only to identify, clarify, compare and contrast computer ethics cases; she or he could also apply theories and tools from philosophy, social science or law in order to deepen our understanding of the issues. Such “theoretical” computer ethics is normally taught in college-level courses with titles like “Computer Ethics,” “Computers and Society,” “Computers and the Law.”

All three “levels of analysis” are important to the goal of advancing and defending human values. Voters and the public at large, for example, should be sensitive to the social and ethical consequences of information technology. Computer professionals and public policy makers should have “para” computer ethics skills and knowledge in order to do their jobs effectively. And scholars must continue to deepen our understanding of the social and ethical impact of computing by engaging in theoretical analysis and research.

In reality, of course, none of these three “levels” of computer ethics is cleanly separated from the others. One blends gradually into the next. Nevertheless, I think it is useful to distinguish them when considering computer ethics in the broadest sense.

Southern Connecticut State University

Contemporary Privacy Issues

1.0 Introduction

This paper broadly discusses the personal privacy issue as it initially developed and how it has changed as a result of the information industry. It makes several suggestions for attempting to deal with the contemporary privacy question as a significant public policy issue but takes no position on the “right answer.”

By way of clarification, this paper does not address privacy issues that are constitutionally based (e.g., the right to make certain decisions about oneself independently of the state – abortion); nor does it address the ones that derive from tort law (e.g., right of a prisoner not to be under constant surveillance); nor does it address other forms of privacy such as the physical invasion of one’s space or the psychological invasion of one’s emotional sphere.

Neither does it address computer security, which, as a management and technical matter, is closely connected with privacy but whose principal thrust is the protection of data and the assurance of certain of its properties (e.g., confidentiality, integrity).

This paper concentrates solely on the dimension of privacy that is related to the use of information about people. It has in the past been called “personal privacy” or “information privacy,” but the issue is larger now than either of the terms has traditionally implied. Today, informational or personal privacy must be considered as a much broader issue than it was throughout the 1970s. The existence, influence, and activities of an information industry have created an entirely new dimension to the issue.

2.0 Historical Development

First, briefly review the development of privacy concepts and privacy law in this country. As the history unfolds, it will become apparent that the United States has chosen a piecemeal approach, in the form of individually targeted laws, as opposed to the comprehensive federal-level approach that Europe has preferred. In this country, we have passed individual laws aimed at specific industries or problems. Typically, a European country has created a data protection board and a data commissioner; together they license and control all database activities in the country.

As a conceptual topic, the first mention of privacy as it relates to computer-based data systems seems to have been in a 1965 paper addressing the impact of computer technology on communications and people, written by Paul Baran of The RAND Corporation (Baran, 1965). His closing paragraph contains the statement: “It may seem a paradox, but an open society dictates a right-of-privacy among its members, and we [computer professionals] will have thrust upon us much of the responsibility of preserving this right.” At about the same time, Alan Westin at Columbia University began his famous study of computer databanks under National Academy of Sciences sponsorship. It was published as “Databanks in a Free Society” (Westin and Baker, 1972).

At the Federal level, privacy law commenced with the Fair Credit Reporting Act of 1970. The credit reporting industry had been misbehaving, and Congress had received so many complaints that it finally did something. The FCRA has remained generally unchanged since then, although Congress plans to hold hearings on its revision. It is this law that lets the citizen see his record and cause errors to be corrected; and should a credit-denial decision be made about someone, the FCRA gives cost-free access to the credit reports that were part of the decision.

Concurrently in the early 1970s, Congress had also started to talk about the use of the Social Security Number as a universal personal identifier. Secretary Elliot Richardson of the [then] Department of Health, Education and Welfare (DHEW) became concerned about all the personal information that the DHEW held not only in the Social Security Administration but also elsewhere. He impaneled a committee to look at the situation and to make recommendations for his action. The author was fortunate to be its chair.

The committee report, the well known “Records, Computers, and the Rights of Citizens” (Reference 3, 1973), introduced the concept of a Code of Fair Information Practices and outlined the content of such a code. The committee of course knew about the FCRA and its provisions, but the committee formulated a set of protective measures that it believed appropriate behavior for any record-keeper of personal information. Subsequently, the name “Code of Fair Information Practices” was conceived during an impromptu after-hours discussion by the group’s leadership. Not everything found its way into the Code as we now know it. For example, at one time there was a proposal on the table that would have required every access to a personal record for whatever purpose to be reported to the data subject.

The report became the intellectual basis of the Federal Privacy Act of 1974, signed by President Ford on 31 December 1974. Parenthetically, the Act was signed on the last day of the year because the president had gone to Colorado for the holidays and the bill (as the story is told) was flown to him for signature.

In addition to outlining the required behavior for all Federal agencies that hold personal information, the Act also created the Privacy Protection Study Commission (PPSC). The latter was a group of seven individuals appointed by the President and by Congress, was supported by a staff varying between 20 and 40 plus an equal number of consultants, and functioned for two years and a few months for a total budget of just over $2M. The author was again fortunate to have been appointed a Commissioner and also to serve as vice chairman of the activity. Many of the staff members have continued an interest in privacy as a social question.

There was an interesting near miss in the Privacy Act. By the time the United States addressed personal privacy, Sweden had already passed the world’s first privacy law and had created a data privacy board with wide powers. This model appealed to some people and, at one point, the draft law did indeed call for the formation of a Federal Privacy Board. There was much opposition on the grounds that the impact on private industry would be extreme and the behavior of the private sector was not well understood anyway. The compromise outcome was creation of the PPSC.

The Commission presented its main report and five appendices, “Personal Privacy in an Information Society,” to President Carter in mid-1977 (Reference 4). In terms of value for money, the group of six reports was a best buy for the country.

While the PPSC examined record-keeping practices in a number of industries and devoted a chapter in its reports to each, no Federal law eventuated. There have been voluntary adoptions of Fair Code practices, and some industries have developed a model privacy policy for voluntary adoption by their members. The primary driver in such actions was avoidance of new law and of government intrusion into the affairs of private industry. The Carter administration did not act promptly on the PPSC report; and by the time it had developed a position, time had run out. Subsequently, the eight years of the Reagan administration were ones of total indifference to privacy; and so far, the current administration has taken no action either.

The point of the brief history is to underscore the observation that the privacy movement at the Federal level commenced with a concern about a specific industry in the private sector – the credit reporting industry – but then with the passage of the 1974 Privacy Act moved largely to concerns about institutions of the public sector.

There were a few other specific laws during the 1970s. One, the Fair Credit Billing Act, gives the individual standing to contest mistakes in his bank card and other credit card accounts. The Family Educational Rights and Privacy Act relates to the ability to see one’s educational records. Various states passed laws in the image of the Federal laws, but sometimes more stringent and sometimes more extensive (for example, providing access to one’s personnel records). California put an amendment in its constitution saying simply that every resident of the state shall have an expectation of privacy.

At present, there are about 20 Federal laws dealing with various aspects of privacy and nearly every state has at least a few privacy laws. Almost every state has laws on medical records, wiretaps, use of polygraphs, and computer crime. The next most frequent law is on arrest records (Reference 5).

3.0 United States Posture

So the picture in the United States has been that of a little attention here, a little there, a little at the Federal level, a little in many states. There has been no comprehensive overarching law as in European countries; in fact, some people in other countries hold that the U.S. behavior is an embarrassment to the world’s interest in privacy protection.

Keep in mind the general computing environment of the time: the stand-alone system, not networked, and generally batch accessed. Given that state of the art, the DHEW committee did not really address the consequences of fast-moving computer and communications technology. It remains to be determined whether a Fair Code, appropriate to the circumstances of the time, is an adequate or proper approach in a highly networked situation, particularly when the networking is between different business organizations, often on a dial-up transaction basis, and sometimes only for sporadic, occasional data exchanges. A current example of today’s practice is the point-of-sale system linking with a check-verification or bank-card system for only the duration of a single transaction.

4.0 Source of the Problem

Throughout the 1970s, “the problem” was generally seen to be with government, which at the time was by far the biggest holder of personal information. While there may today be databases in industry that rival the size of the largest government ones, the Federal establishment certainly still has the biggest collection of large databases within one organizational structure. It is safe to say that the Federal government is still the biggest record keeper in the country. In addition, many agencies of government have broad powers of enforcement, which add to the image of the Federal government as the problem.

By way of clarification, note the relationship between privacy and security. The personal privacy issue is an information-use issue, only secondarily an information-protection issue. If use of information is to be controlled, then access to it must be controlled; this is a basic tenet of computer security.

In occasional instances the protection issue is more direct. Sometimes personal information has been declared confidential or private by law; for example, census data from the very beginning, taxpayer data starting in 1976, and certain data relevant to law enforcement and the justice system. By implication, such categories of legally identified data require protection, which however is usually not defined in detail. From a computer security standpoint, one does not know whether protection implies restriction of access and/or restricted usage and/or proof against unauthorized change and/or timely updating and/or what. Such dimensions are left to the holding agency to decide.

In contrast to only a few instances of legally prescribed protection, the Privacy Act applies to all agencies of government and with exceptions for a few categories of data such as classified and intelligence data, it seeks to control the usage of personal information.

5.0 Privacy as a Public Policy Issue

While it has never been stated as public policy, the thrust of privacy law has been tacitly to acknowledge that it is legitimate to use personal information so long as the data subject knows of the use in advance and the use treats the data subject fairly and, by extension, is not abusive. Hence, the Privacy Act contains a provision for announcing in the Federal Register what are called “routine uses” of specified personal information. “Fairly” is defined implicitly through the provisions of the Code; e.g., the individual has the right to see the record, to challenge its contents, and to cause errors to be corrected.

There is one final observation to complete the characterization of privacy and its embedding during the 1970s. Use of personal information within government was generally to adjudicate entitlements, rights, benefits, and privileges. Thus, it was even more natural for the prevailing view to be that “government is the rascal; government is the place to watch for privacy problems.” In the 1970s, moreover, usage of personal information was generally confined to looking up the record; there was not much commingling of data from many sources, nor elaborate processing. Computer matching had not come into vogue. It was talked about but generally shouted down on the basis that records were too full of errors to match entries from different databases. There probably was a small amount of hand matching as there undoubtedly always has been.

Other than credit reporting, there was little or no “information industry” in the early 1970s; there was no organized industry whose commodity-in-trade was personal information and whose economic viability depended on exploiting such information for profitable gain.

6.0 Contemporary Privacy

Now ask: What has changed since the 1970s? What is really different about privacy issues today?

The answer:

A thriving industry dealing in personal information has emerged and it is quite different in nature from what existed in the 1970s.

In the large, it emerged from the same companies that pioneered credit-data reporting, but there are new players also. The original credit-data companies had the data; they knew how to manipulate and process it; they had the big computer systems in place; they had the means for distribution; they had an established customer base. They saw the revenue stream and the potential profits from vastly larger activities in the selling of personal data, and new technology supported anything they wanted to do.

Computer and communications technology has driven the growth of the information industry. Information that once had to be laboriously assembled by hand or by punched-card methods can now be bought in machine form. For instance, the Departments of Motor Vehicles in the states are generally glad to sell lists of registered drivers and car ownership. Corresponding lists of boat and aircraft owners are equally easy to get. Property records have been computerized and are, by law, public records. A lot of other things, including some related to justice and law enforcement, also fall into the class of public records. Census tract data can be purchased on tapes, floppies, and disk packs. A whole industry thrives on assembling and selling data.

The telephone companies, liberated by the AT&T breakup, generate additional revenue through sales of phone lists, sorted and indexed in various ways. The United States Postal Service can provide a database matching zip codes to street addresses. Even census data, anonymous as it is, can be folded in on the basis of geographical location. Lists of contributors can be obtained; lists of subscribers to magazines or technical journals can be bought. There are on-line subscriber databases where some information is readily available for a fee, even to individuals.

Business is booming in personal information – in the large, without legal or regulatory oversight, and without a stated public policy to guide or oversee it.

In addition to legitimate and legal sources, all sorts of leakage opportunities exist for other kinds of information to migrate from possibly protected environments into general databases. Facts (for example, one’s Social Security Number) have a way of getting around; and with extensive and rapid interchange of data among record keepers, nothing is secret very long.

Nominally, lists are thought of as serving mailing purposes, but indeed they can have quite different uses. One of the more troublesome is the completeness of the dossier that can be developed from factual information, either as it exists directly or as it can be inferred or extrapolated from partial data.
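The dossier-building mechanism is simple enough to sketch in code. The following toy example is entirely invented for illustration – the names, records, fields, and lists depict no real person, vendor, or database – but it shows how two separately innocuous lists, joined on name and address, plus anonymous census data keyed to geography, can yield a surprisingly complete dossier:

```python
# Hypothetical lists of the kind described above: a motor-vehicle list,
# a magazine subscriber list, and census-derived neighborhood data.
dmv_records = [
    {"name": "J. Doe", "address": "12 Elm St", "vehicle": "1989 sedan"},
]
magazine_subscribers = [
    {"name": "J. Doe", "address": "12 Elm St", "magazine": "Yachting Monthly"},
]
# Census data is anonymous, but geography links it back to an address.
neighborhood_income = {
    "12 Elm St": "upper-middle",
}

def build_dossier(name, address):
    """Merge every list entry that matches on (name, address)."""
    dossier = {"name": name, "address": address}
    for rec in dmv_records + magazine_subscribers:
        if rec["name"] == name and rec["address"] == address:
            dossier.update({k: v for k, v in rec.items()
                            if k not in ("name", "address")})
    dossier["income_band"] = neighborhood_income.get(address, "unknown")
    return dossier

print(build_dossier("J. Doe", "12 Elm St"))
```

The point of the sketch is that no single list is alarming on its own; the completeness of the dossier comes entirely from the join.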

6.1 Current Example

There is currently a group of six lawsuits against TRW Credit Data for allegedly providing inaccurate or inappropriate data to its clients. One assertion is that it reports bill-payment practices; how can it do that? Easily and simply (Reference 6).

Recall the incident of Lotus Marketplace: Households. Using data supplied by Equifax from its own files and combined with other sources, Lotus Development Corporation proposed to market a CD-ROM database of 120 million individuals containing a wide assortment of facts about each: name, address, gender, age, marital status, dwelling type, neighborhood income, neighborhood lifestyles, buying propensity – overall, a quite good dossier on half the people in the country (Reference 7).

Each credit card issuer makes a monthly report to TRW of the account status. There is no problem at all in inferring one’s clearly evident payment record on such accounts. Given the kind of data assembly that Lotus and Equifax did, it is clear that statistical relationships can be established among levels of income, how income is used, and credit obligations. Combine these with all the known facts in the database, and bill-paying habits – the answer – fall out by inference. Of course, the inference can be wrong in individual cases; and to the extent that such derived data is used to make credit and other decisions about people, some fraction of the country’s population is being harmed by use of incorrect data – data that the data subject does not see, probably cannot have access to, and hence has no way of correcting or challenging.
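The inference step can likewise be sketched. The rule below is entirely invented for illustration – it depicts no actual credit-scoring formula – but it shows how a derived attribute such as bill-paying habit can be manufactured from income and obligation figures, and how it can be wrong for an individual who never sees the label:

```python
def infer_bill_paying(annual_income, credit_obligations):
    """Invented statistical rule: obligations high relative to income
    are scored as a poor payment risk."""
    ratio = credit_obligations / max(annual_income, 1)
    return "poor" if ratio > 0.5 else "good"

# The rule labels this (hypothetical) person a poor payer...
print(infer_bill_paying(annual_income=30000, credit_obligations=20000))  # "poor"
# ...even though the person may in fact pay every bill on time.
# The derived label, not the underlying facts, is what gets sold.
```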

7.0 Public Policy Again

Look what has happened. In privacy of old, there was an implicit public policy that the government could use personal information so long as the data subject knew it and so long as there was no abuse or discrimination. Private industry operated in the shadow of the government’s unstated policy.

Now abuse is not the whole issue; use per se is also at issue, perhaps even use without any consideration of abuse. Typically the data subject does not, and probably cannot, know of specific uses except inferentially or through their consequences. Some organizations offer a so-called “opt-out” feature whereby one’s name is assertedly removed from a list before it is sold. But it would be a full-time job for anyone to stay off all the lists that exist. Among other things, every transaction with a mail-order firm puts a name back into circulation. There is no such thing as permanently staying off lists.

We knew where to look for the abuse problems: either the government or a few industries that had been targeted by law. It has become questionable whether the government is still the prime concern in the game of privacy.

Assuredly, the practices of government have changed somewhat, most notably the explicit and sanctioned matching of computer records. Even there, though, we have a law – which amounts to a privacy law – for the proper conduct of such events. Checks and balances are in place, perhaps not perfect, but at least there is some monitoring and oversight.

The private sector’s practices have broadened and matured extensively, generally without oversight, public outcry, or even visible publicity except in a very few cases. Checks and balances are not in place; there is no law. The private sector increasingly exploits personal information without regard to, or awareness of, the broader implications or consequences. The industry trafficking in information about people is constrained only by its own sense of ethics and morality, by its sense of propriety and social obligation, and by the risk of offending the social conscience; sometimes, perhaps often, the lure of new revenue or good profits will outweigh caution and socially responsible judgment. If there has been no explicit public policy guiding government use of personal information, even less is there any for private-sector use.

7.1 An Illustration – CNI

As an illustration, take the current debate over calling number identification (CNI, also called ANI), which the telephone companies are beginning to offer. Without getting into the pros and cons of CNI as a desirable adjunct to telephone service, consider how it is being marketed. The traditional “buyer beware” context in which we have all learned to function implies (1) that we have alternate choices – competitive products or services – and (2) that we are under no compulsion to buy.

Marketing of the CNI service violates both of those premises. At the local telephone level, we do not have competitive choices; and many telephone companies offer, or propose to offer, the service in a biased manner: a subscriber’s phone number will be forwarded to the called party on every call unless the calling subscriber pays an additional fee or takes additional action while calling to suppress the CNI function. Every subscriber is being forced to become part of a game with which he may wish not to be involved.

But staying out of the game involves either an additional fee to the subscriber – as obtaining an unlisted number has traditionally imposed an extra monthly fee – or additional key touches or dialed digits to turn off the process. One way, the uninterested subscriber pays money; the other way, he pays in the time taken for the additional dialing actions.

Normally one pays for something if he wants it. The telephone companies have inverted the principle; they are asking you to pay for it if you do not want it.

Whatever one thinks about the merit of CNI and its present implementation, the telephone companies have implicitly, and without public debate, made a major public policy declaration. In effect, policy has been established that calling number identification is so important to society overall that every telephone subscriber must participate in it, unless he is willing to pay extra in some fashion to avoid it. The phone companies are not only marketing CNI in a socially biased fashion; they have intruded on the long-standing tenets of a capitalist society – “buyer beware” no longer works. And they have done this from a position of monopolistic power.

Just to put this in perspective, marketing of call-forwarding is neutral; there is no compulsion. If one wants it, pay for it; if not, do not pay for it; but no one is finessed into being involved with it against his wishes.

8.0 The Broadened Public Issue

Are there other examples of such implicitly made public policy? Yes, indeed. The most obvious one underlies the entire discussion of contemporary privacy. The present information industry has made it de facto public policy that use of personal information without pre-knowledge of the data subject, without visibility to the data subject, and totally outside his control is appropriate behavior for profit-making industry. It is a complete negation of the Code of Fair Information Practices, except in the few instances for which law happens to have been passed.

Without doubt, it is time to realize that the private sector has become a very prominent “opponent and potential source of problems” in terms of how personal information is used, what effect it has on people, how widely it is shared, and the implicit public policy that is being made. We surely now must put the private sector and the government on equal terms so far as utilization and exploitation of personal information is concerned, each with potential for abuse and misuse, and each with different effects on public policy and social mores.

9.0 Possible Approaches to Protection

What are possible mechanisms for dealing with the new dimensions of privacy?

1. The Fair Code, perhaps extended or elaborated in some way, would be one approach; but as presently embodied in law, it can only compel a specified behavior by a Federal agency. It remains to be seen whether such a Code can be effective in an environment of private-sector network connectivity.
2. Specific targeted law, which is what we have done to date. Of necessity it will always be after the fact, and the response time of lawmakers may not be very rapid.
3. Give an individual standing to sue if he can show harm. The catch here will be in defining harm and in establishing its reality. Showing harm in a court of law is tedious, difficult, and in the context of privacy, may be impossible. Certainly a lot of case law and therefore time would be required to give it a good foundation.
4. Give an individual standing to compel a prescribed-by-law set of behaviors by the record-keeping organization. This is probably easier to administer than a standing to sue for harm, but working out an acceptable “Code of Fair Behavior” that would cover much, or most, of the information industry would be tricky. On the other hand, one might be able to deal with organized industries one by one, and tailor a Fair Code to each.
5. Put in place more formalized watchdog organizations. Many states have consumer rights offices and perhaps they can be bolstered. With the interstate nature of the information industry though, it is really a federal problem, not a state problem.
6. Go all the way and put a data protection board in place in the Federal government. Such a proposal has been made in draft legislation but as proposed it probably would not be effective. Europe would be delighted, and it would avoid a nasty coming problem for the country. The European Community (EC-92) move to create a “United States of Europe” is floating a uniform privacy position that deals with flow of personal information across national borders, and the present U.S. posture in privacy is not consistent with that proposed for EC-92.
7. Institute some form of insurance to cover unusual privacy events or circumstances for which we can think of no other response. But establishing an insurance claim would again equate to showing harm, and it has already been observed that such is difficult to do.
8. Establish by law that an individual has a right of ownership in information about himself – a very unusual approach. Imagine the consequences if every private-sector organization that used personal information would have to acquire rights to or pay royalties for the use of information about oneself.

And just what would be included in personal information? Name – it is already splattered all over public and private records. Address – the same thing. Financial information – government already has a lot of it, and some of it, such as bankruptcy events, is a matter of public record; the credit industry already has a lot. It would be a retroactive destruction of the industry to impose on it a royalty structure payable to every data subject.

While an interesting thought and attractive in some ways, a royalty approach would seem to fail simply for practical reasons. An information industry, whose stock-in-trade is personal information, has a right to exist under the laws and heritage of this country. Businesses have the right to conduct their affairs in a way economically advantageous to themselves, so long as they remain within the law.

9. Deal with privacy on an industry-by-industry basis. Consider as an example the cable industry which happens to fall under various regulatory acts, one of them the Cable Communications Act of 1984. Section 631 of the Act levies certain obligations on every cable operator, and they generally reflect the provisions of the Code of Fair Information Practices – right to see the record, to challenge errors, to cause corrections, to know the uses made of it. Such an approach could be used for any cohesive organized industry, especially if some law already regulates it.

Of the possible ways to accommodate privacy concerns, no one is really wholly satisfactory but maybe some combination of them can be effective.

10.0 Related Effects

Quite aside from privacy issues and designing protective measures, wide use of personal information has other implications and consequences, ranging from positive payoff to serious and real harm.

  1. There are certain benefits for many individuals from widespread use of personal information. Statistically sound public surveys indicate that Americans value privacy, but are quick to exchange it for the affluence and life style that this country offers and that utilization of personal information helps to make possible.
  2. There are undeniable annoyances; people are inundated by junk mail and incessant phone solicitations.
  3. In some cases there is harm or near approaches to harm (e.g., the individual who gets held in jail overnight because of an error in police or car records or similarity in names).
  4. Occasionally, there are truly serious consequences for individuals. The worst case is that of the unfortunate movie actress who was murdered by a deranged man who obtained her address through the California Department of Motor Vehicles.

11.0 Privacy as Social Equity

While it is possible to list choices for dealing with privacy, how is the country going to come to grips with the intricacy of the matter and with the public policy aspects?

Here is one construct:

  1. Identify the issue as a social-equity one.
  2. Get the public policy aspects clearly in view and consciously determine them.
  3. Identify the stakeholders and their respective interests; compare/debate/contrast/compromise among the obviously competing interests and strive for a balance point around which one can think constructively about desired remedial actions such as law.

Obvious stakeholders in privacy include each and every citizen, society at large, law enforcement, the information industry, and government. There may be others; surely all of the private sector will be interested and much of it will want to be involved. A major problem though is to find the forum in which to debate such a complex matter, and how to conduct the debate in an orderly way.

Issues stated in an equity context are well known in the affairs of the country. Frequently the issue of concern is between an industry and organized public interest groups (e.g., utilization of natural resources such as timber or oil vs. conservation groups). But as an equity issue, privacy has some quite different dimensions, notably its pervasiveness throughout all societal processes. It is not accidental that such is the case; information after all has a very central role in everything.

12.0 New Privacy Versus Old

Let us now clarify the new aspects of privacy relative to what we knew privacy to be in the 1970s.

One is certainly “use” as contrasted to “abuse” of personal data. Socially accepted uses of information about people must certainly be an explicit component of the privacy debate today, as must protections against abuse. And the well-being of society and the payoff to it must be considered along with the rights and privileges of the individual.

A second new aspect is the question of what protection should be afforded to collections of personal information that are aggregated into a database as a byproduct of conduct of business (e.g., the databases within point-of-sale systems, the financial information in retail store systems).

A third new aspect, created really by the existence of the private-sector personal-information industry, is control of access by third parties, especially access by the many enforcement agencies at federal, state and local levels – law enforcement, tax authorities, drug enforcement, welfare and social assistance programs. But there is also access by lawyers or other components of the judicial system – divorce lawyers, public prosecutors, availability to the discovery process of law suits. And always there is the issue of access by the press.

Who is to be allowed to access and use the databases of the personal-information industry? And what controls or penalties should there be to protect against abuse? Or against access by inappropriate people? Will broad privacy protections drive any underlying requirement for sound computer security in systems?

These points are epitomized in the databases behind all the point-of-sale records that abound in the retail trade. What about supermarket checkout records? How should they be used? Protected? Who may have access to them? And the records of drugstore chains, especially pharmacy records? What controls should be in place on them? And who may use them and for what purpose? Drug enforcement and abuse? Social workers? Insurance companies? Medical oversight boards?

13.0 Context for New Privacy

As general context for contemplating privacy and possible approaches to protection of it in its new and enlarged image, it is worth restating a position first enunciated by the PPSC. Namely:

Privacy must not be the cover or the excuse for violating the law. Conversely, privacy should not impede actions that are in the nature of assuring that there are not violations of the law. The latter interpretation becomes tricky but in fact really is one basis for computer matching at government levels.

And it is also worth pondering a related concept that also was introduced during the PPSC. Putting it into contemporary phrasing:

The net effect of modern record-keeping practices and use of personal information, all supported by current computer and communications art, is to tighten the processes of society. Things we have been used to doing, and anomalies that we discovered and exploited in a world of paper systems, are vanishing or being made more difficult.

A trivial example is the float in bank checking accounts. It used to take several days for a check to clear and people often counted on that time delay to manage cash flow. Debit cards and prompt posting of credit card transactions plus automated check clearing systems have pretty much taken that unintended advantage away.

14.0 Privacy Versus Public Distaste

Put this last point into a current context.

Suppose that some company institutes an information service that tightens some aspect of society, but people do not like it. Or suppose some company introduces a product that exposes – makes more visible to more people – personal information in a way that people will not like (e.g., the Lotus Marketplace: Households product). Unless such actions violate some specific law, they are simply a matter of management taste and concern for society in considering new information products.

Does this country want an oversight function that reviews such things? And perhaps could force them to stop? Or would just publicize them? Probably not, but there are forces that can bring public pressure or publicize (e.g., consumer action groups, consumer advocates within government). In the Lotus case, the electronic mail networks of the country played a major role in organizing opposition.

But to what extent can business or government be compelled to cease-and-desist just because people dislike something? Government will only respond to Congressional action. Business will often respond to consumer resistance, but if a vendor sells only to other businesses and not to individuals, individual and collective consumer resistance may have little effect.

Remember that the private sector is a stakeholder in the equity of privacy. Because of the pervasiveness of the privacy issue and of information usage, would it be unfair to subject the private sector to generally uncontrolled public pressures? In a country of 250 million people there will always be a vocal group that does not like something, or such a group can easily be whipped together by a few organizers.

Also keep in mind that this country has to exist in an international world. Things that a U.S. population might not like could indeed be important to our relationship with other countries.

15.0 The Future for Privacy

Privacy is a complex issue; privacy is a vexing issue. Solutions to its protection are easy to imagine, hard to implement, and awkward to get into law. As a public policy issue, it concerns us all. Each of us is surrounded by the same record systems as anyone else in the country. We are all subject to the same consequences, to the same possible abuses, to the same annoyances, to the same sense of society’s tightening.

To each of us as professionals the issue is one demanding our responsible and ethical behavior. In its four decades, the computer industry has not spawned any national problems of the kind that the automobile industry allowed to happen with atmospheric pollution, or the power industry has with nuclear waste and acid rain, or the chemical and mining industries have with lake, river and stream pollution. But it takes constant alertness from all of us connected with any part of the information industry to assure that some problem will not arise. As professionals, we can and should be responsible monitors of what is happening, vocal, alert, cautious, attuned to the activity of our own companies, our own state governments, and the Federal government.

We have the knowledge that lets us be more insightful than most of the population. We should see the coming problems sooner; we can sound the alarm and take action sooner. We can exert a certain measure of ethical responsibility on behalf of those who are less informed.

Each of us is part of some industry, and the message is much the same as to each of us as a professional. Be alert; be informed about privacy issues and latent risks and developments. Be vocal and responsible in taking positions on privacy issues related to your technology and your industry. Keep up to date on events that will influence what your industry will or can do.

Resolution of contemporary and future privacy concerns will take more than the collective will of society. True, there are new forces coming into prominence – new technologies for widespread communication and quick organization of positions and groups, the facsimile machine and electronic mail among others. In the end, though, law is almost certainly essential; getting the attention of the political process is always hard, but there is homework to be done first. Privacy as a social phenomenon, driven so hard by technology and the exploitation of information, must be understood and its intricacy structured. Until that can be done, we are likely to be taking potshots at problems and consequences that may not be the central problems in the big picture.

RAND Corporation

16.0 References

  1. Paul Baran, “Communications, Computers and People,” The RAND Corporation, P-3235, November 1965. Also in AFIPS Conference Proceedings, Vol. 27, Part II, 1965 Fall Joint Computer Conference, Spartan Books, Washington, D.C.
  2. Alan Westin and Michael Baker, Databanks in a Free Society, Quadrangle Books, 1972.
  3. Records, Computers, and the Rights of Citizens, Report of the Secretary’s Advisory Committee on Automated Personal Data Systems, U.S. Department of Health, Education and Welfare, DHEW Publication (OS)73-94, July 1973. Contains a good bibliography.
  4. Personal Privacy in an Information Society, the Report of the Privacy Protection Study Commission, July 1977. There are also five appendices on specialized topics, including a discussion of how the 1974 Privacy Act had been working.
  5. “Compilation of State and Federal Privacy Laws,” The Privacy Journal, Washington, D.C., 1989.
  6. San Francisco Chronicle, July 10, 1991, p. C1 [Reuters]. Paraphrased in RISKS DIGEST, Vol. 12, Issue 05, July 11, 1991.
  7. Excerpted from the advertising literature of the product announcement by Lotus Development Corporation, plus facts elicited through correspondence with the company and shared via electronic mail.

Information as a Commodity: Control and Benefit are Morally Owed to the Source

1.0 Introduction

Willis Ware has done two important things in his excellent track address: first, he has given us a wonderful historical perspective on the contemporary privacy issues; and, second, he has clearly laid out a wide range of alternatives for addressing those issues. By setting everything up so nicely, Ware has now presented the rest of us with the challenge of developing an understanding of the issues which will move us toward their resolution.

At the same time, Ware’s presentation may have the unfortunate side effect of being too neutral, giving the impression that the playing field is somehow level at the start. This then encourages the impression that “everybody” affected by the privacy issues has a roughly equal and legitimate stake in the enterprise of resolving those issues. His nine options may then be seen (mistakenly) as presenting a nine-way intellectual “gridlock,” with each interest just as important as every other, so that the problems are resolvable only at the lowest common denominator. This “gridlock” could then result in downplaying, and thus reinforcing, the significant advantage currently enjoyed by the information industry in any resolution process.

My approach will be to argue against any sense of equality or privilege for the industry. Specifically, I will argue that the needs and desires of the information industry must take a subordinate role to the needs and desires of those whose information forms the “raw materials” for that industry. I will then make several suggestions, different from those Ware has made, which I believe will be fruitful directions in which to look for a resolution to the primary issues.

2.0 Reframing the Issues

There would be nothing much to talk about if personal information were worthless. Because it is not, as evidenced by the growth and expansion of the information industry, the question must be how to structure that industry so that proportional benefit to the source of the information is a primary objective.

The first relevant question to be asked is not “Who has a stake here?” but rather, “What is at stake here?” or, “What is the primary material of production in this industry?” The answer, of course, is first and foremost personal information about individuals, and secondarily information about groups or companies. Without information, the “information industry” cannot function, any more than the steel-making industry can function without iron ore.

Recognition of this point is crucial, because it reframes the issue. For there is no other major industry which can obtain its material of production for free, and generally without the knowledge of the individuals who are the source of that material. There is no other industry which can so cavalierly define itself and its materials so as to avoid responsible recognition of the source of those materials. As a result, the information industry’s position on privacy issues, as described by Ware, is untenable.

The only possible way the industry can avoid this untenable position on privacy is to accept one of three alternatives: first, deny that it is an industry and/or that information is its basic material, so that the critique does not hold; second, accept the critique and its legitimacy in principle, but reject its application for some reason other than its correctness; or, finally, prove that information as a material is so different from other materials of production that the critique is not valid because it simply does not fit.

The first alternative is obviously absurd given today’s information industry, although as Ware pointed out, 20 years ago it would have been plausible. That alternative will thus be rejected out of hand. The second alternative has some promise, and in fact underlies the industry argument that nothing should be done about understanding or resolving the privacy violation problems because the cost to the information industry would be too high. This alternative will be discussed in Section 3 below. The third alternative has more promise, and in fact underlies industry claims that most relevant information is actually “public” thus does not admit of the same treatment as other types of materials, such as iron ore, which are “private.” This alternative will be broached in Section 4 of the paper, although its complexity will require much fuller treatment.

Before we go further, however, it is necessary to distinguish between two categories of information, original and derivative. Original information is that which is usable in its presenting form and which derives directly from the source. Such information, which may be “public” or “private,” has as its fundamental characteristic some fact(s) about the source. Derivative information, on the other hand, is that gained by inference, extrapolation, statistical analysis, or other manipulation or interpretation of original information. Ware’s example of CNI is an instance of original information, as is a mailing list compiled from mail order customers. His example of statistically projecting bill-paying habits is an instance of derived information, as is projection of demographic trends from current census data.

The importance of this distinction is that all arguments concerning information acquisition must first be made concerning original information, since without such information, the derived is of little consequence or concern. Additionally, derived information raises concerns peripheral to the privacy issue, such as accuracy of predictions and legitimacy of predictive information in factual decision making. This paper will focus exclusively on considerations relevant to original information.

3.0 The “High Cost to Industry” Defense

The single most common defense raised by businesses against any proposed reform is that it will cost too much. In fact, the de rigueur argument, no matter what the proposal, seems to be as follows: “All increasers of the cost of doing business are unacceptable; All reforms of business practices proposed from outside are increasers of the cost of doing business; Therefore, All reforms of business practices proposed from outside are unacceptable.” It is thus not surprising that information industry representatives moan loudly about any possible reforms in their practices. This is especially so when those reforms suggest reimbursing information sources for utilizing information about themselves.

To fully understand what is at issue here, we need to compare the information industry to other industries on two counts: first, the industry’s need for “raw materials” in order to operate; and, second, the normal industry recognition that the supplier controls that which s/he supplies.

3.1 Raw Materials as a Cost of Doing Business

It is important to note that the cost-increase argument is fundamentally different when comparing the information industry with others, since unlike other industries, the information industry does not pay original sources (the person or entity the information is about) for their material resources. In any other industry, the originating source of all required materials is part of the compensation structure for that industry. Even if, as a steel producer, I use pelletized iron ore bought from the processor rather than from the mine, the miner is still compensated for the ore which the processor pelletizes before selling it to me for steel production. Why should the information industry be exempt from this normal state of affairs? It surely cannot argue that the material (information) has no value, for its very existence belies that claim. The material may indeed have no intrinsic value, but it clearly has market value. Why then does the industry claim the contrary?

The material needed for operations in other industries has clearly identified costs for the original source; e.g., buying the land under which the oil is discovered, drilling the well, buying recovery and storage equipment, etc. Reimbursement for such costs, at market rates, of course, is accepted as a cost of doing business. Perhaps the information industry believes that it can have legitimate access to information, without cost, because they do not acknowledge any “production” costs for the original source. For basic information, the question would be, “Where is the cost?”

A name costs nothing, so why can it not be used for free?

While interesting, that position is suspect because it makes an unwarranted dichotomy between information and its source, i.e., my name and me. It fails to recognize that my name, of necessity, names me. My name is thus not an abstract entity without any “attachment.” My name would in fact have no value to the information industry were it not specifically attached to me. Unlike my name, I cost my parents a fair amount, both in original production charges (pre-natal care, delivery, hospital charges, etc.) and in post-production development (food, clothing, housing, schooling, etc.); and there have also been significant costs since then. Without my job, and the income it produces, I (via my name) would be valueless as information. Thus the costs of obtaining my degree and the 18 years of experience in achieving the position I now hold, which makes my name valuable to the industry, are all costs that I had to pay for the material the information industry needs for its operations. On the other hand, it is ironic that the only sure way to stay out of most information databases is to become poor and homeless.

Without belaboring it, there are costs to me for the material that the information industry believes it should have for free. Thus, at least without a substantial argument to show that those costs should not count in their determination of basic value, there is no obvious reason why the industry should be allowed to discount these costs. Importantly, the burden of proof is on the industry to prove its claim, not on me to defend against it, since the material, because it is about me, is mine to begin with. The alternative claim, that just because information is about me it is not mine, is just plain silly.

Another reason for the apparent dissimilarity between the information industry and others may be that it is hard to get iron ore from someone’s mine, or oil from their wells, without their knowledge (at least in large enough measure to effect any significant result). And if one does so, it is without question recognized as theft, as for example when angle drilling from one’s own property into the oil reserves underlying another’s property, or short-weighing ore delivery trucks. On the other hand, information is most often easily obtained without the original source’s knowledge. But does that mean it is legitimately free? The industry seems to think so. Yet, on analogy, if I leave the gate to my oil storage facility unlocked, does that mean anyone may help themselves to the oil, at no charge? Clearly not; so why then is the moral status of the industry’s actions to be seen any differently? The burden of proof lies with the industry to show that their position is correct before that position is accepted as legitimate.

Finally, if someone is asked to give materials for free and does so, then there is no recourse for later claiming theft or royalties, unless of course the agreement was tainted by lies, misrepresentation, scare tactics, etc. As a result, industry in general understands that it has no real choice but to pay for its required materials, unless they can get them donated, which is unlikely. The information industry, on the other hand, operates as if its required materials had been donated, because the industry has defined for itself unique notions of privacy and legitimate access to original materials. But more on this in a moment.

3.2 The Cost of Correcting Past Inequities as a Cost of Doing Business

That the industry already exists and has significant resources tied up in its operations is not totally irrelevant, but neither is it sufficient to accept the status quo, unless a case can be made that the material needed for their product, the original information, was legitimately obtained. To do otherwise is like allowing drug kingpins to retain the profits from their illegal activities after they have been convicted. What makes the information industry even worse is that they get to retain not only the profit but the product. For all they do is sell a copy, retaining the master for theoretically infinite resale. On analogy, the drug kingpin would need to be able to retain both the drug and the income, which is of course impossible. The information industry is thus in a unique situation which must itself be justified, not simply used as the reason for maintaining the status quo.

More importantly, there is a strong underlying assumption in this country that past inequities should be redressed. We may argue about exactly what the inequities are, or how they should be redressed, and by whom; but we do not question the basic assumption. In fact, that assumption is the foundation of both the torts system and civil rights enforcement in the United States. As a result, the recognition of past inequities in the information industry’s obtaining of its raw materials should lead to corrections as they would apply to any other industry.

3.3 Originator Control of Materials as a Feature of Doing Business

In all other industries, the original source of materials is also recognized as the controller of those materials, as so clearly evidenced by the cost and availability of crude oil, and its effect on every imaginable segment of industrialized society.

Granted, in the capitalist system, the market drives the demand, which in turn drives the cost. Yet all of that can be heavily manipulated by the originating source, demonstrating that the originating source exercises recognized control in the process.

Once again, however, the information industry has made an exception of itself, most likely for two reasons: first, as noted above, it is easy to obtain information without the original source being aware; but, second, and more important, it is most likely that if the industry had to obtain consent they would have significantly less information to utilize, since there would be a considerable level of refusal. After all, how many people would consent to have their name included on a pyramid listing for junk mail? Yet to forgo such consent would be like forgoing informed consent for participation in medical research on the grounds that doctors could get more research done if the subjects did not know they were being used for research purposes, since then no patient could opt out.

To require that the information industry recognize the control of the original source of information would crimp their current style of operations. But in all other industries this must simply be accepted as one cost of doing business; why should the information industry be exempt? The strongest argument at this time seems to be that the information is in fact “public,” thus available for anyone to use as s/he sees fit. That argument must now be examined.

4.0 The “Private” is “Public” Defense

Ware has pointed out the historical origins of the information industry in financial activities. The underlying basis of those activities, to assure good information and prevent unnecessary credit risks, thus significant losses, seems reasonable enough (although the current S & L crisis surely gives pause). And although abuses in the financial information industry have led to regulation of those activities, the conception of information sharing which those activities spawned remains unregulated.

At the heart of those activities is an assumption by the industry that claims to privacy are mitigated by two sets of circumstances: first, that the original source has “voluntarily” relinquished the information; and, second, that the information in question is made “public,” usually by entry into some governmental record, such as a court proceeding or legal filing.

4.1 Private Information and “Voluntary” Release

4.2 Private Information in the “Public” Domain

The second mitigating circumstance, the “public” record, is much harder to attack, requiring at least in part philosophical arguments concerning the nature of personhood. However, it is possible to make several suggestions which throw serious doubt on the industry’s claims to unlimited access.

To begin, when information appears in a governmental domain which is not legally defined as confidential, the industry reads “government” as “public,” and deems its access and use of the information to be appropriate. Just because the residing place of information is governmental, however, it does not follow that the information may be taken from that place and used privately, for commercial purposes. To do so is to confuse utilization with access. The earlier example of unlocked oil storage is an appropriate consideration here; the unlocked gate does not authorize the taking of oil, regardless of whether it is subsequently sold or used personally.

Another example is a patient’s hospital medical record. There has been repeated legal verification of the conceptual distinction between record and information; the record may belong to the hospital, but the information the record contains belongs to the patient. In the current consideration, the record may be “public” in the sense that it belongs to a government agency, but the information in that record still belongs to the individual. Moreover, the privacy of medical records has been repeatedly upheld, even for “public” hospitals. Since medical records and other governmental records contain much of the same personal information, there appear to be no grounds for automatic exemption of the information industry from the commonly accepted standards of privacy.

Yet another example is personal. Our local newspaper publishes a special listing, in its Saturday edition, of the owner’s name and address for every house sold with a sale value over a specific amount. My wife and I did not realize this until we bought our house, and could not figure out how all our coworkers knew exactly what we paid for the house. We asked how they knew and they told us. When questioned about this, the newspaper’s only response was “The information is a matter of public record, thus we may legitimately use it however we see fit.” I was never able to get a justification for the publication, however, only a continuous reassertion of the public nature of the information.

Importantly, there were immediate results from the publication. Solicitations by phone and mail increased phenomenally. For weeks, not a single dinner hour went by without at least one call, frequently several. One would think word was out that despite the house’s price, it was infested with every known vermin, minus windows, studs open to the weather, missing roof, no heat or air conditioning, faulty wiring, unlandscaped yard, etc. At least some solicitors, including accountants, insurance agents, car dealers, investment advisers, and merchants of all sorts, chose the mail, of which there was often so much the mail carrier had to strap the excess to the outside of the box!

The point of these examples is to question the industry claim that information is “public” on the grounds that it misconstrues the nature of records by confusing the physical record and the information it contains. Moreover, it is to suggest that the utilization of such information is far from benign for the original source. On both counts, a case can be made either for not allowing use of the information or, if used, compensating the original source.

4.3 Autonomy as the Basis for Understanding Privacy

The key to working on the privacy problem is not to ask “How much regulation do we want?” but “What should be the moral basis for resolving the problem?” Ware has suggested the principle of justice, through the concept of equity. I would like to suggest that justice must be supplemented by autonomy as well. For autonomy is a fundamental principle of ethics without which justice would have only a formal, not a practical, development.

The primary aspect of autonomy is the recognition that a competent individual has primal authority over him/herself. In short, s/he is considered to be self-determining. The agreed limitation to this self-determination is the notion of harm, such that my right to act as I wish must not harm another. This may appear problematic when the industry claims harm by not being allowed to continue business as usual. Note, however, that such a claim ignores the fact that the industry already has the information, and received it illicitly. Thus individuals have a counter claim that they have previously been harmed by the industry. Granted, as Ware has indicated, clearly defining “harm” is problematic. The key, however, will be the recognition that the definition, to be legitimate, must come from the individual, not the industry. For the industry would, of course, wish to define “harm” in its own best interests, which is inappropriate given the primacy of the individual. To do otherwise is simply to perpetuate the industry’s advantage.

The interface of privacy and autonomy occurs in the recognition that one of the autonomous decisions each of us reserves to ourselves is the determination of what we allow others to know about us. This notion is rooted in fundamental metaphysical considerations about the nature of personhood, exemplified by Wasserstrom and others. What must be kept in mind is that degradation of privacy is a degradation of respect for persons and a diminution of their status as autonomous beings. In fact, it makes no sense to say, as we do in tort law, for example, that no one may access my body without my consent, if information about my body is not similarly protected.

5.0 Toward a Resolution

Unlike Ware, I do not believe that these problems can be resolved by the computer professionals in the information industry, no matter how well intentioned. Certainly, there can be no resolution without their participation, because of the insight, knowledge and responsibility that Ware notes. But it is a well known fact that it is difficult at best for employees to effect changes in company policy, especially when those changes are dictated by moral considerations. At the same time, we must keep in mind Margaret Mead’s observation that we should “Never doubt that a small group of thoughtful, committed citizens can change the world. Indeed it is the only thing that ever has.”

At a practical level, it would seem straightforward to begin closing down some of the questionable uses of information. The easiest would be to start with unauthorized multiple distributions of information. Initially, legislation could prohibit the distribution of any information for which the original source has not specifically consented. Based on the medical model of consent, this would need to be positive (e.g., a signature or formal declaration of consent is required), not negative (e.g., if I do not send back the card I am assumed to have consented).

Those selling mailing lists, or information to data bases, would then have to give the original source both the option to consent and the option to levy a fee on the user. In a true free enterprise system, I would set the fee myself, leaving the industry user the option of accepting my information, and the fee, or not using my information. The fee system could be set up much like the existing copyright clearing house fees. Those choosing to participate could then be paid their “royalties” in a timely fashion.
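The consent-and-fee mechanism proposed above can be sketched in code. This is a minimal illustration only; the function, the names, and the fee values are hypothetical, not part of any actual clearing-house system:

```python
# Sketch of the proposed opt-in model. A list broker may use only records
# whose original source gave positive consent, and owes each such source
# the fee that source set. All names and fee values are hypothetical.

def usable_records(records, max_fee_per_name):
    """Return the names a buyer would license and the total royalties owed.

    Each record is (name, consented, fee): positive consent is required
    (no returned card means exclusion), and the source sets the fee.
    """
    licensed = []
    royalties = 0.0
    for name, consented, fee in records:
        if consented and fee <= max_fee_per_name:
            licensed.append(name)
            royalties += fee
    return licensed, royalties

sources = [
    ("A. Smith", True, 1.00),   # opted in, asks $1.00 per use
    ("B. Jones", False, 0.00),  # never returned the consent card: excluded
    ("C. Brown", True, 5.00),   # opted in, asks $5.00 per use
]

names, owed = usable_records(sources, max_fee_per_name=2.00)
print(names, owed)
```

The point of the sketch is that both sides retain a choice: the source decides whether to appear at all and at what price, and the buyer decides whether any given name is worth that price.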

Such procedures should be initiated in steps, but must have provisions which do not permit the “grandfathering” in of information gathered before the legislation, unless retroactive permission is obtained. This could be done, for example, by each data base mailing out business reply cards to everyone on their list, with only those returning cards remaining on the list beyond the expiration date. If the fines were large enough, compliance would be assured, especially with vigorous initial policing.

Granted, this would impose significant problems on the industry, but as noted above, that is not sufficient to deter taking action. What it will do is force the industry to engage in good old competitive marketplace considerations of whether all that information is really worth as much as they thought. 12,000 names for $175 is of no concern; would each of those names be worth $1.00 though? $5.00? $15.00? Only the competitive market will tell.
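The marketplace arithmetic can be made concrete with the figures given in the text (a quick sketch; the candidate per-name fees are simply the ones posed above):

```python
# Figures from the text: a list of 12,000 names currently sells for $175.
names_on_list = 12_000
current_price = 175.00

print(current_price / names_on_list)      # under a cent and a half per name

# If each source set a per-name fee, the same list would instead cost:
for fee in (1.00, 5.00, 15.00):
    print(fee, names_on_list * fee)
```

At source-set fees the list runs to tens or hundreds of thousands of dollars, which is exactly the competitive-market question the proposal would force the industry to confront.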

Such legislation is admittedly a form of retributive justice for the industry’s past offenses against the public, both by violating their right to privacy and by causing to be foisted upon them tons of mail and hundreds of phone calls which they did not want. As noted above, however, there is no prima facie reason why this is not appropriate.

There should also be formal redefinition of “public” documents to assure that the information they contain may be accessed by the public, but not used by the public. In that way an eager insurance salesman could read the public record to know that I bought my house, how much I paid for it, etc., but could not use that information for his own business purposes without the permission of the original source, the home owner.

Nothing that I have proposed here would eliminate blanket mailing or solicitation, but it would cut down significantly on targeted approaches. More importantly, it would greatly reduce the haphazard development of informational databases, because the costs would be too great. Those that were developed would thus be more carefully thought out, and ultimately more valuable, both to the industry and to the original sources.

6.0 Concluding Thoughts

By expanding upon Ware’s comments, I hope that I have given more focus to key elements of the privacy problem. This is not to disagree in general with Ware, but only to push his beginning in new directions.

Most important, Ware is to be commended for his tireless efforts (beginning in the 1970’s) to warn us that as individuals, regardless of our professions, we must pay more attention to these matters. The latent risks are enormous. As technology continues to develop, the potential for expansion is only limited by the imagination. Abridgment of privacy is not a trivial matter, and we would do well to heed Ware’s warnings.

Information is not benign. As more becomes available, its potential negative impact increases. Already, for example, we are seeing employers attempt to gain medical information which is irrelevant to the employee’s work, only to reduce payouts from the company health insurance policy. Already we see discrimination on the basis of information, for example sexual preference or religious beliefs. Already we see discrimination based on projections and statistical interpretations of personal financial data. The list could go on and on.

How far we want to go is, as Ware says, up to us. But each of us has the responsibility, as autonomous moral agents, to make intelligent, well informed decisions. Anything less not only plays into the hands of those who misuse personal information, but in the end will mean that George Orwell was right, just a few years off.

University of Oklahoma, Health Sciences Center

Comments on Willis Ware’s “Contemporary Privacy Issues”

Loftus E. Becker, Jr.

Willis Ware has well summarized the major issues regarding one aspect of privacy in the new information age. My intention here is to broaden the playing field a little, and add a few remarks from a lawyer’s perspective.

We live in a time of increasing paradox. On the one hand we are driven to recognize that knowledge is power. On the other hand we are drowning in a sea of information. Medieval scholars could memorize the few books available to them. Few of us have time to read even the abstracts of articles that bear on our interests.

The same is true for those who might (for fun, profit, or malice) want to know the details of our lives. As Ware puts it, the “completeness of the dossier that can be developed” from information out of our control is terrifying. But most of us are protected for the moment by the fact that nobody cares enough to spend the time and money to develop it.

In devising more permanent controls on private information, we must recognize that we face not one problem but two. On the one hand we need to give individuals adequate control over what Erving Goffman has called the “presentation of self.” On the other, we need to assure that in limiting the dissemination of personal information, we do not by those limits create an “information elite,” a privileged class to whom (and to whom alone) this powerful information is available.

To take a current example, I do not myself know how much information about Clarence Thomas’s life and background should be available. But I am quite sure that it should not be the province of a privileged few. The dangers of limiting information, and the power it gives, to a few individuals and groups can be as great or greater than the dangers of making the information available to all.

Of course the first problem we face is deciding what kind of information we want to protect as “private” or “personal.” But this is only half the battle. It is equally difficult to devise mechanisms that give real and not just theoretical protection.

To begin with, in the foreseeable future governmental agencies alone can give only very limited protection. American governments are starved for funds. We’re closing our public libraries and hospitals. Agencies from the IRS to the Environmental Protection Agency lack the staff and money to do what they’re charged with doing. An Information Privacy Agency, however well intentioned, will not fare any better. For effective protection, we will have to devise schemes that mix governmental power with private resources.

One good starting point would be a registration and disclosure requirement for dealers in personal information. Obviously such a requirement would have to be carefully written: too many small businesses have to file too many unread reports. But there is little reason that the major players should not be required to report, publicly, the kind of information they sell or rent, and their rules (if any) for access to that information. Only when we know what’s actually going on can we start to think clearly about how and whether we want to change it.

Second, we should remember that the law is a blunt axe. It works best when its rules are few and simple. Consider the sales tax. State legislators produced sales taxes of exquisite discrimination, so that (for instance) in New York State, Prell Shampoo is taxed (as a cosmetic), while Head and Shoulders (a medicine) is not. The trouble, of course, is that drugstore clerks can’t and don’t carry the rule book in their heads. So it is with regulations concerning the use and dissemination of personal information. Perfect rules won’t work; we’re going to have to settle for imperfect, but clear and understandable ones.

Third, although Willis Ware is undoubtedly correct in saying that, ultimately, nationwide standards will be necessary, I think we should be slow to adopt them. Put bluntly, we just don’t know enough about what’s happening or what effect any proposed rules will have. One great virtue of our federal system is the ability of the states to serve as, in Justice Brandeis’ words, little “laboratories for experiment.” Letting the states try out a variety of rules will in the long run serve us better than moving too quickly to a uniform federal solution.

Fourth, we should not be too quick to back away from protecting privacy even when the protections serve to shelter crime. No doubt the recent horrifying murders in Detroit could have been avoided if, in 1984, we had put television monitors in everyone’s bedroom. But few of us would be willing to give up that much of our remaining privacy. The extent to which protecting privacy will hinder the prevention and detection of crime is, assuredly, one of many factors relevant to deciding how much protection to give. But it is only one such factor, not a talisman in whose presence the right to privacy disappears.

Finally, if we want to do it there is neither a conceptual nor a legal bar to creating at least a limited property right in personal information about oneself. Babe Ruth’s name, and Charlie Chaplin’s image, are known to millions; but anyone who markets a “Babe Ruth Baseball Bat,” or advertises her product with a Chaplin look-alike, will quickly find that such familiarity is no defense to an action by their heirs. Similarly we could, if we so desired, forbid the selling of mailing lists and other bits of personal information about an individual even though that information may be on record from a variety of sources.

Nor does the information industry have a prescriptive right to continue to exist. No person has a right to continue a business reasonably believed detrimental to the general welfare, as the manufacturers of DDT and fluorocarbons well know. Please note that I am not saying we should establish such a right, or that the information industry should be shackled or destroyed. I am saying only that the proposed right is not unusual in the law, and that should we ultimately decide that the business of selling personal information does more harm than good, the fact that it already exists is no legal bar to its regulation or abolition.

University of Connecticut School of Law

Rationale for the Proposed Revision of the Association for Computing Machinery’s Code of Professional Conduct

Why an Ethics Code?

Professional groups are both technical and moral communities because in order to be self-regulatory the members must set shared goals and specify appropriate ways to achieve them (Camenisch, 1983; Frankel, 1989). In order to specify these appropriate standards it is necessary to detail what types of behavior are ethically acceptable or not.

As the computing community has evolved, so have new ethical problems such as intellectual property rights and the monitoring of electronic mail. During the past few years there has been growing pressure to refine and disseminate standards of ethical computing. The rise in computer-related crime has generated new calls for computer ethics. Similarly the widening impact of computing upon society has fostered new concerns for ethical and social responsibility. Indeed the new ACM/IEEE-CS curriculum guidelines, Computing Curricula 1991 contains an entire section on “social responsibility” (Joint Curriculum Task Force, 1991).

Most societies of scientists and engineers have codes of ethics or professional conduct despite philosophical and political arguments against them. The philosophical critique is based on the inconsistency between an “autonomous professional” and binding rules in the form of an ethical code. For this reason Ladd (1980) calls ethical codes morally confusing. The political complaint against codes of ethics is that they can become excuses for avoiding personal or collective action (Chapman, 1990). For instance, a company’s code of ethics may specify the rights of the company to restrict an employee’s behavior, even if that constraint is unfair.

Far more compelling are the arguments in favor of codes of ethics for professional societies. Frankel (1989) offered a long list of arguments for the positive functions of a code. The three most important functions of a code, he notes, identify different types of codes. Codes are primarily (1) aspirational, giving ideals to strive for, (2) educational, intending to educate or socialize some constituency, or (3) regulatory, hoping to sanction violations of the standards. Most codes are intended to achieve all three aims to some degree, but a careful examination may reveal a concentration upon one of them. The most recent codes of the Institute for Electrical and Electronic Engineers (IEEE), the International Federation for Information Processing (IFIP), and the American Society for Information Science (ASIS) are good examples of the aspirational focus. The International Society for Technology in Education (ISTE) Code of Ethics appears to be primarily educational, whereas the prose of the current Institute for Certification of Computer Professionals (ICCP) and Association for Computing Machinery (ACM) codes of professional conduct suggests primarily regulatory purposes (p. 73). With regard to the aspirational emphasis, Frankel points out the importance of a code as an “enabling document,” that is, a guide to help the professional make more informed, wise decisions. Within this framework, the code works toward the collective good even though it may be a mere distillation of collective experience and reflection.

An educationally oriented code emphasizes the need for professional socialization. The significance of this function is not only the training of new recruits but the solidarity of the initiated. A code of ethics helps to reinforce professional identity and allegiance, as well as commonly held group values (Frankel, 1989).

One of the most important educational functions of a code can be its clarification of why one should follow specific ethical imperatives. An ethical or professional code of conduct may convey many “should nots,” but if it also gives the moral rationale for each ethical admonition, it will be far more effective. This educational function can be even greater if the code offers help with setting priorities for moral and professional conduct.

The regulatory purpose of a code is accomplished to the degree that it deters unethical behavior. It is sometimes assumed that this cannot be fulfilled unless penalties are spelled out and a code enforcement operation has been installed. However, the code may be effective simply by suggesting an appropriate sanction for any given violation, and by requiring professionals to report errant colleagues. In the latter case, each professional assumes a responsibility for upholding the group’s integrity.

All codes of professional ethics help to offset the inevitable tension between a profession and its public. A code becomes the profession’s self-proclaimed values and defines its role in society. It holds the profession accountable to the public, but without making unreasonable demands. Thus it serves as a basis of both public evaluation and professional accountability. This tends to yield a major payoff in terms of public trust. Frankel argues that “to the extent that a code confers benefits on clients, it will help persuade the public that professionals are deserving of its confidence and respect, and of increased social and economic rewards.”

Another positive function of a code of ethics is its support of professionals against unwarranted erosion of their stature and power. Martin and Martin (1990) described how self-regulatory procedures can be withdrawn by the government, and Frankel noted the power of a code for constraining overly demanding claims from clients.

Lastly, a code can be extremely important for a professional group because of the process the group must go through to establish it. Getting a large number of people to attend to ethical concerns, and then organizing discussions around those issues, can yield a surprisingly great sensitization to issues of ethics and the profession.

Background of the ACM Code of Professional Conduct

The current ACM Code of Professional Conduct was developed between 1969 and 1972 and adopted by the ACM Council in 1973. At that time there were no international data communications networks, and phrases like ‘computer virus’ and ‘computer inequity’ were unheard of. New technology and new ethical issues quickly outdate the code, but the substance of ACM’s Code has not been revised in 20 years. In 1990, the ACM’s SIGCAS (Special Interest Group on Computers and Society) decided to address the need to review and revise the ACM Code. A grant from the ACM SIG Discretionary Fund launched the “Ethics Project” in the summer of 1990. A small task force to revise the code emerged later.

The project held its first public meeting on September 9, 1990. Over 50 people participated in this all-day SIGCAS Ethics Symposium. The symposium included reports of research on computer ethics as well as presentations on different approaches to teaching computer ethics. But the main accomplishment was the delineation of a large, diverse set of issues in computer ethics and the clarification of many problems underlying revision of the ACM Code.

In March 1991 during the ACM meetings in San Antonio, an open forum was held to collect additional input and to determine the next steps. From this meeting emerged a consensus on several procedural and policy issues. Prior to this meeting we solicited comments and suggestions from ACM officers and specialists in computer ethics.

At the June 20, 1991 meeting of the ACM Council, the members were polled on their opinions regarding possible revision of the ACM Code of Professional Conduct. In general there was agreement on the directions to revise and expand the Code. The ACM Council is the most important organizational unit with respect to the revision process because the ACM Constitution (Article 6, Section 8) states that the ACM Council “shall adopt, maintain, enforce and conspicuously publish and display to all members and the public a code of professional ethics which shall be binding on all members.” One way it fulfills this function is by requiring that new members sign a statement that they subscribe to the purpose of ACM “to develop and maintain the integrity and competence of individuals engaged in the practice of information processing.”

The next step, the public review process, took place during the week of August 12 – 16, 1991 at the National Conference on Computing and Values (NCCV) in New Haven, Conn. During that week many long and lively discussions were held to revise the first draft of the revision of the ACM Code. The task force has assembled the numerous suggestions from these individual and public meetings in its latest draft (draft No. 18) of the revision.

An Approach to Revision of the Code

To begin the task of drafting a revised code of ethics, we collected the codes from similar professional associations including IEEE, IFIP, ISTE, EDUCOM, ASIS, ICCP, and the Data Processing Management Association (DPMA). In general we found these codes, including ACM’s, deficient in several ways: they tend to be difficult to read, excessively impersonal, and needlessly abstract; to possess little sense of priority; to be more negative than positive or proactive; to forget the autonomy of the individual professional; to neglect audiences other than employees; to offer little rationale; and to omit references to moral principles. While it is impossible to overcome all of these deficiencies, in drafting a revision of the ACM Code we sought to minimize them. The directions taken are outlined below.

  1. Inclusiveness. All of the basic ethical principles from the ACM Code were retained in the draft revision. Additional ethical imperatives were added from the codes of other professional computing associations, such that the draft revision encompasses all these codes with these exclusions: teacher-specific items in the ISTE Code; several employer-oriented items in the DPMA Code; and some items in the IFIP Code pertaining specifically to international law.
  2. Semantic simplicity. We found nearly universal consensus that Bylaw 17, the ACM Code of Professional Conduct, should be restated in more informal, less difficult language. The language, structure and format of the current Code were borrowed from the ABA Code of Professional Conduct. However, in 1983 the ABA changed its Code to “Model Standards,” which include the “Model Rules of Professional Conduct” and a “Model Code of Professional Responsibility.” Neither statement uses the categories “Disciplinary Rules” and “Ethical Considerations” as a structure. In the draft revision, both the structure of the code and specific phrases have been extensively reworked in order to make the Code simpler and more readable.
  3. International, Multicultural Orientation. Perhaps the most thorough code of ethics for computing has been drafted by the IFIP Ethics Project. Under the leadership of Harold Sackman, IFIP released an official Draft Code of Ethics. This IFIP Code offers a model for resolving some of the issues that face the computing community, particularly trans-national issues. The draft revision of the ACM Code takes an international rather than a national perspective, although the term “international” is not used extensively.
  4. Personalizing the Code. Some codes are unreadable except by lawyers or authors of constitutions. Filling a code of ethics with formal expressions, such as “I shall,” in our judgment emphasizes the theoretical and exaggerates the impersonal. We believe that the codes should be written in the first person, so that the statement implicitly becomes a personal ethical commitment. Other forms of expression imply that the principle applies to others and not necessarily oneself. In the revised draft of the code the first person expression, “I shall…,” is stated only once within each section. Thus each principle is stated both as a personal covenant or vow and as an imperative or directive. The existing ACM Code is highly formalized in expression, and its language is more consistent with the official ACM Bylaws. With the proposed change in form of expression, the ACM Council may choose to publish it separately rather than as one of the bylaws, as the Code is now.
  5. Minimizing the negative. Many ethical statements are “thou shalt nots,” but not all must be expressed in the negative. We have attempted in a number of instances to express the imperative in positive or proactive language.
  6. Recognition of the autonomy of the individual professional. If an ethical principle is stated in an explicit form with all of the conditions completely specified, there is no discretion left for the individual professional to make an ethical judgment. It is impossible to write an ethical code for computing that is completely definitive because of the rapidly changing nature of the field. Some authors such as Wolfson (1990) have attempted to move in that direction, but in this draft we have avoided trying to specify all the conditions relevant to applying any given ethical principle. We believe that what is most needed at this time is a clear, straightforward statement of the basic ethical precepts for computing. This approach is the one most compatible with the concept of the professional as autonomous and professional groups as self-regulatory.
  7. Focusing upon moral principles. Typical ethical computing codes are so cerebral that they may implicitly give us excuses to ignore the ethical issues. By neglecting morality and human emotion, they seem to trivialize the consequences of ethical violations. In drafting a revised code we attempted to rectify this problem by organizing the main ethical principles around more basic moral principles. Given this approach we have called the draft a “Code of Ethics and Professional Conduct,” whereas the existing code is called a “Code of Professional Conduct.”

Structure of the Code

The draft code is divided into four sections with the first section giving a set of general moral considerations, the second identifying additional ethical principles applying to computing professionals, the third section pertaining to organizational leaders, and the final section dealing with issues of general compliance with the code.

There are some important considerations embedded within this structure, especially some commonly held notions of priorities. The first section contains items that are implicitly given higher priority because they are closely linked to moral considerations. The imperatives in the other three sections are not devoid of moral or ethical considerations, but involve additional types of considerations. Within each section the individual items are ordered to some extent with the more critical ones appearing before the less important ones. We do not know to what extent it will be possible to come to consensus on these priority rankings, but within each section the ethical premises generally were ordered in descending priority.

Remaining Issues

  1. Audience. An exhaustive code is impossible, and some feel that a series of specific codes should be developed by ACM for specific audiences, e.g., students or researchers. Members of the ACM Council were asked, “Should ACM develop different ethical codes for different audiences or a single one?” A majority responded that there should be a single code; however, several members expressed a desire for a separate code for students. Several items in the draft mention students, but we still need to be convinced that the development of a special code (or section of the code) for students is necessary. A number of ACM members expressed an interest in ethical codes for organizations. This need has been addressed in section three of the revised Code, which includes a set of ethical principles for individuals who are in “decision-making roles in an organization.” Thus, while the entire code is written from the point of view of individual persons, the principles in section three deal explicitly with those ethical problems that are organizational issues.
  2. Ongoing Maintenance of the Code. A code of ethics must evolve as contemporary issues emerge or change. Many ACM members expressed a desire to have ACM establish a mechanism whereby ethical and social guidelines or “opinions” are regularly issued, redrafted, and reissued. The American Medical Association provides an example of such a mechanism; their Council on Ethical and Judicial Affairs annually releases its Current Opinions, which includes the “Principles of Medical Ethics” as well as more detailed “Opinions on Social Policy Issues.” ACM needs a revised organizational structure for an ongoing review, reformulation, interpretation, and application of its Code of Ethics and Professional Conduct. As part of the project to review and revise the ACM Code, this question will be investigated carefully and some specific proposals will be made to the ACM Council.
  3. Enforcement. The issue of appropriate sanctions and enforcement has long been a source of controversy for the ACM Code of Professional Conduct. So much attention has been placed upon these issues that the content and general use of the Code have been neglected. We found considerable disagreement on how to deal with the enforcement issue. Some recommended that the Code be written for the purpose of stating and clarifying the ethical standards of the majority of members of the Association. Consistent with this goal, the Code could also provide a model for other associations and organizations of all types to use in formulating their own ethical criteria. Others still believe that for ACM’s Code of Professional Conduct to be worthwhile it must have “teeth” and the ability to sanction those who violate it. The dilemma is that the more enforceable the code, the greater the potential liability. In the past, ACM has not been willing to incur any serious amount of legal liability in administering its Code of Professional Conduct. The question for the ACM is whether or not to continue that policy. The ACM Council, in response to the question “Should ACM attempt to establish a more enforceable or a less enforceable code of professional conduct?” did not agree on enforceability. It should be noted that in the discussion before the survey, it was brought to the attention of the Council that the ACM Constitution (Article 6, Section 8) states that the ACM Council “shall adopt, maintain, enforce and conspicuously publish and display to all members and the public a code of professional ethics which shall be binding on all members.”
  4. Administration and Education. An ethical code generates administrative responsibilities over and above enforcement procedures. One of the major functions of the ACM Committee on Professional Standards and Practices has been its information services or “ombudsman” role for individual ACM members who face ethical problems. These problems often have involved whistle-blowing dilemmas, employment contract questions, and product liability issues. Perhaps these information needs should be handled by another membership services unit within ACM, but it would appear that the need is greater than has been recognized in the past. The question is how ACM can organize to deal with these membership needs more effectively.
  5. Development of Positive Incentives. The Task Force working on the revision of the ACM Code has been asked by the ACM President, John White, to develop recommendations to the ACM for establishing positive incentives for ethical behavior. For instance, the ACM could institutionalize an annual award for “ethical computing.” The Task Force will make its recommendation to the ACM in 1992.


Creating, applying, and updating an effective code of ethics is essential for any association of computing professionals. But the ethical challenges of the future may be so great that present concerns and procedures may have to be altered radically in order to reduce the distance between technological “can dos” and ethical “shoulds.” Already there are hints of radically new issues. For instance, an organization in Eastern Europe has claimed to release a new computer virus every seven minutes, but as yet there is no international criminal justice system to cope with such global threats (6). Another new vulnerability derives from the digitization of all kinds of information. More and more sound and video material is getting digitized, yet there are no techniques for tracing fraudulent alterations of these creations. Digitizing combined with virtual reality techniques, which produce experiences where fantasy and reality are indistinguishable, could create entirely fabricated “events.” Enforcement institutions cannot keep pace with such rapid technological change. All we have to fall back on are people and their ethical standards.


References

  1. ACM/IEEE-CS Joint Curriculum Task Force, “Computing Curricula 1991,” Communications of the ACM, Vol. 34, June 1991, pp. 68–84.
  2. Robert F. Barnes, “The Making of an Ethics Code,” Bulletin of the American Society for Information Science, August/September 1990.
  3. Paul F. Camenisch, Grounding Professional Ethics in a Pluralistic Society, Haven Publications, 1983.
  4. Gary Chapman, Comments at the ACM/SIGCAS conference Computers and the Quality of Life, September 1990.
  5. Mark S. Frankel, “Professional Codes: Why, How, and with What Impact?” Journal of Business Ethics, Vol. 8, February/March 1989.
  6. Kenneth R. Hey, “Techno-Wizards and Couch Potatoes,” OMNI, Vol. 13, August 1991, pp. 51ff.
  7. C. Dianne Martin and David H. Martin, “Comparison of Ethics Codes of Computer Professionals,” Social Science Computer Review, Vol. 9, 1990.
  8. Donn Parker, Report of Project on Ethical Conflicts in Information and Computer Science, Technology and Business, Stanford Research Institute (SRI), 1988.
  9. Richard S. Rosenberg, “Ethics and Professionalism,” in The Social Impact of Computers, Academic Press, 1991.
  10. Joel Rothstein Wolfson, “A Code of Professional Responsibility – An Ethics Code with Bite,” paper presented at the ACM/SIGCAS conference on Computers and the Quality of Life, September 1990.

Appendix: ACM Code of Ethics and Professional Conduct


Preamble: Commitment to professional conduct is expected of every member (voting members, associate members, and student members) of the Association for Computing Machinery (ACM). Section 1 consists of fundamental ethical considerations; Section 2 includes additional considerations of professional conduct; statements in Section 3 pertain to individuals who have a leadership role; and Section 4 deals with compliance. Each principle was intentionally phrased both as a personal vow and as an ethical imperative. ACM prepares and maintains an additional document with guidelines for interpreting and following the Code.

1. General Moral Imperatives
As an ACM member I will…

1.1 Contribute to society and human well-being.
1.2 Avoid harm to others.
1.3 Be honest and trustworthy.
1.4 Be fair and take action not to discriminate.
1.5 Honor copyrights and patents.
1.6 Give proper credit for intellectual property.
1.7 Respect rights to limit access to computing and communication systems.
1.8 Respect the privacy of others.
1.9 Honor confidentiality.

2. Additional Professional Obligations
As an ACM computing professional I will…

2.1 Strive to achieve the highest quality in the processes and products of my work.
2.2 Acquire and maintain professional competence.
2.3 Know and respect existing laws pertaining to my professional work.
2.4 Encourage review by peers and all relevant groups.
2.5 Give comprehensive and thorough evaluations of computer systems, their impacts, and possible risks.
2.6 Honor contracts, agreements, and acknowledged responsibilities.
2.7 Improve public understanding of computing and its consequences.

3. Organizational Leadership Imperatives
As an organizational leader, I will…

3.1 Articulate social responsibilities of members of the organizational unit and encourage full acceptance of these responsibilities.
3.2 Design and build information systems to enhance the quality of working life.
3.3 Articulate and support proper and authorized uses of organizational computer technology.
3.4 Ensure participation of users and other affected parties in system design, development, and implementation.
3.5 Support policies that protect the dignity of users and others affected by a computerized system.
3.6 Support opportunities for learning the principles and limitations of computer systems.

4. Compliance with the Code
As an ACM member, I will…

4.1 Uphold and promote the principles of this Code.
4.2 Take appropriate action leading to a remedy if I observe an apparent violation of the Code.
4.3 Understand that violation of this code is inconsistent with membership in the ACM.

Guidelines for Interpreting and Following
the Association for Computing Machinery (ACM)
Code of Ethics and Professional Conduct
(Draft No. 21 – 11/02/91)

The Task Force for the Revision of the ACM Code of Ethics and Professional Conduct prepared these interpretations of the proposed draft Code. The Task Force proposes that periodic updates and extensions of this document be prepared by the ACM Executive Committee or a designated committee.

1. General Moral Imperatives

As an ACM member I will…

1.1 Contribute to society and human well-being.

This principle concerning the quality of life of all people should be interpreted as an obligation to protect fundamental human rights and to respect the diversity of all cultures. An essential aim of computing professionals is to minimize negative consequences of systems for users, including threats to health and safety. In system design and implementation, computing professionals should attempt to ensure that the systems are used in socially responsible ways by assessing whether or not social needs will be met and by avoiding harmful effects to health and welfare. In addition to a healthy social environment, human well-being includes a healthy natural environment. Therefore, computing professionals who design and develop systems should be alert to, and make others aware of, any potential damage to the local or global environment. In order to fulfill this imperative, the computing professional may have to contribute uncompensated time and expense.

1.2 Avoid harm to others.

“Harm” means injury or any negative consequence; thus this imperative prohibits actions within a computer system that result in harm to any of the following: users, the general public, employees, or employers. Harmful actions include intentional destruction or modification of files and programs leading to serious loss of resources.

Unintended actions may also lead to harm. In such an event the person or responsible party is obligated to undo or mitigate the negative consequences caused unintentionally or unknowingly. One way to avoid unintentional harm is to make choices considerate of all those affected by the decision.

To keep from indirectly harming others, computing professionals must minimize malfunctions by following generally accepted standards for system design and testing. Furthermore, they must attempt to assess the social consequences of systems in order to project the likelihood of any serious harm to others. If system features are misrepresented to users, coworkers, or supervisors, the individual computing professional is responsible for the resulting injury.

In his or her work environment the computing professional has the additional obligation to report any signs of system dangers that might result in serious personal or social damage. If one’s superiors do not act to curtail such dangers, it may be necessary to “blow the whistle” in order to correct the problem or reduce the risk. Before “whistle blowing,” risk must be thoroughly assessed, and it is suggested that advice be sought from other computing professionals. See principle 2.5 regarding thorough evaluations.

1.3 Be honest and trustworthy.

Honesty is an essential component of trust, and without trust an organization or a profession cannot function effectively. The honest computing professional will not make deliberately false or deceptive claims about a system or system design. He or she will offer full disclosure of all pertinent system limitations and errors. A computer professional has a duty to be honest about his or her own qualifications, and about any competing obligations that might lead to conflicts of interest. Furthermore, an ACM member will not misrepresent ACM or its policy in any way.

1.4 Be fair and take action not to discriminate.

The values of equality, tolerance, and respect for others underlie the fairness imperative, as do the principles of equal justice. Discrimination on the basis of any distinguishable social characteristic, such as race, sex, religion, age, disability, or national origin, will not be tolerated.

Inequities between different groups of people may result from the use of information and technology. In a fair society, all individuals will have equal access to computer resources regardless of race, gender, disability, age, or other such characteristics.

1.5 Honor copyrights and patents.

Violation of copyrights, patents, and the terms of license agreements is prohibited. Copies of software should be made only with proper authorization; unauthorized duplication of any materials should not be condoned.

1.6 Give proper credit for intellectual property.

Computing professionals are obligated to protect the integrity of intellectual property. Specifically, one should not steal ideas or take credit for others’ work.

1.7 Respect rights to limit access to computing and communication systems.

Theft or destruction of tangible and electronic property is prohibited by imperative 1.2 – “Avoid harm to others.” Trespassing and unauthorized use of a computer or communication system is addressed by this imperative. Individuals and organizations should have the right to restrict access to their systems so long as they do not violate the discrimination principle (see 1.4). No one should enter or use another’s computer system, software or data files without permission. One must always get appropriate approval before using another’s system resources such as computer time.

1.8 Respect the privacy of others.

Computers and related information technology enable the collection and exchange of personal information. Consequently computer systems sometimes increase the potential for violating the privacy of specific individuals and groups. Many codes specify the responsibilities of professionals to maintain the privacy and integrity of data describing individuals.

In March 1991 the ACM Council passed a “resolution on legislative guidelines to protect privacy of individuals.” Computer professionals should ensure that systems do not diminish individual privacy; this encompasses maintaining the reliability and accuracy of any personal information held in a system. Furthermore, procedures should be established that allow individuals to review their records and correct inaccuracies. This imperative requires that only a minimal amount of personal information be collected in a system, that retention and disposal periods for that information be clearly defined, and that personal information gathered for one purpose not be used for another purpose without the person’s consent. These principles apply to electronic communications, including electronic mail, and prohibit procedures that capture or monitor electronic messages without the permission of users.
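As an illustration only (not part of the Code or the ACM resolution, and with all names invented), the retention, purpose-limitation, and consent requirements described above might be sketched as a minimal record type:

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta

# Hypothetical sketch of the privacy practices described above:
# collect only minimal fields, define a retention period, and
# require the subject's consent before reusing data for a new purpose.

@dataclass
class PersonalRecord:
    name: str                 # minimal collection: only what is needed
    purpose: str              # the purpose the data was gathered for
    collected: datetime
    retention: timedelta      # clearly defined retention period
    consented_purposes: set = field(default_factory=set)

    def __post_init__(self):
        # The person implicitly consents to the original purpose.
        self.consented_purposes.add(self.purpose)

    def expired(self, now: datetime) -> bool:
        """True once the defined retention period has elapsed."""
        return now - self.collected > self.retention

    def use_for(self, purpose: str) -> bool:
        """Permit use only for purposes the person has consented to."""
        return purpose in self.consented_purposes

    def grant_consent(self, purpose: str) -> None:
        """Record the person's consent for an additional purpose."""
        self.consented_purposes.add(purpose)

# Example: data gathered for "billing" may not be reused for
# "marketing" until consent is granted.
record = PersonalRecord("A. Member", "billing",
                        datetime(1991, 8, 1), timedelta(days=365))
assert record.use_for("billing")
assert not record.use_for("marketing")
```

A real system would also need secure storage, audit trails, and a disposal mechanism once `expired` returns true; the sketch shows only the consent and retention bookkeeping.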

1.9 Honor confidentiality.

The principle of honesty extends to issues of confidentiality of information whenever one has made a promise to honor confidentiality. The general ethical concern is to respect all promises of confidentiality to employers, clients, and users. However, one is discharged from this obligation if the law requires that the confidential information be divulged.

2. Additional Professional Obligations
As an ACM computing professional I will…

2.1 Strive to achieve the highest quality in the processes and products of my work.

Excellence is perhaps the most important obligation of a service profession. The computing professional must strive to achieve quality and to be cognizant of the serious negative consequences that may result from inadequacies in a system.

2.2 Acquire and maintain professional competence.

Excellence in a self-regulating profession depends upon individuals who take responsibility for acquiring and maintaining competence. In addition, individual members should participate in setting common standards for appropriate levels of competence. Upgrading technical knowledge and competence can be achieved in several ways: independent study, seminar and course attendance, and involvement in professional organizations.

2.3 Know and respect existing laws pertaining to my professional work.

Since computer professionals live in societies with legal systems, they should respect and obey existing local, state, national, and international laws unless there is an ethical basis not to do so. Policies and procedures of the organizations in which one participates should also be obeyed. But compliance must be balanced with the recognition that sometimes existing laws and rules can be immoral and, therefore, must be challenged. Thus violation of a law or regulation may be ethical when that law or rule has an inadequate moral basis. If one decides to violate a law or rule because it is viewed as unethical, one must fully accept responsibility for one’s actions and for the consequences.

2.4 Encourage review by peers and all relevant groups.

The advancement of a professional community depends upon extensive review and critique of one another’s work. Individual members should seek and utilize peer review of both their ideas and their tangible outcomes. Fulfilling this obligation may result in professional enrichment, help reduce system problems, and improve quality.

2.5 Give comprehensive and thorough evaluations of computer systems, their impacts, and possible risks.

Computer professionals should strive to be perceptive, thorough, and objective when evaluating, recommending, and presenting system descriptions and alternatives. Computer professionals are in a position of special trust, and therefore have a special responsibility to provide objective and well-grounded evaluations to employers, clients, users, and the public. To avoid conflicts of interest, any relevant sources of non-objectivity should be disclosed.

As noted in the discussion of principle 1.2 on avoiding harm, any signs of danger from systems should be reported to those in a position to resolve them. Additional action may also be necessary, but before “whistle blowing,” risk must be thoroughly and comprehensively assessed.

2.6 Honor contracts, agreements, and acknowledged responsibilities.

Honoring one’s commitments is a matter of integrity and honesty. For the computer professional this includes ensuring that system elements perform as intended. When one contracts for work with another party, there is an obligation to keep that party properly informed about progress toward completing that work.

A computing professional has a responsibility to turn down any assignment that he or she feels cannot be completed as defined. Only after serious consideration and with full disclosure of risks and concerns to the employer or client, should one accept the assignment. The major underlying principle here is the obligation to accept personal accountability for professional work. On some occasions, other ethical principles may take greater priority.

2.7 Improve public understanding of computing and its consequences.

Computing professionals have a responsibility to share technical knowledge with the public by encouraging general understanding of computing, including the consequences of computer systems and their limitations. This general imperative implies an obligation to counter any false views related to computing.

3. Organizational Leadership Imperatives

As an organizational leader, I will…

This section draws extensively from the draft IFIP Code of Ethics, especially its sections on organizational ethics and international concerns. The ethical obligations of organizations tend to be neglected in most codes of professional conduct, perhaps because these codes are written from the perspective of the individual member. This dilemma is addressed by stating these imperatives from the perspective of the organizational leader. In this context “leader” is viewed as any organizational member who has leadership or educational responsibilities. These imperatives generally may apply to organizations as well as their leaders.

3.1 Articulate social responsibilities of members of the organizational unit and encourage full acceptance of these responsibilities.

The impact of organizations on the public yields a set of responsibilities to the community and the society. Procedures and attitudes oriented toward quality will reduce harm to members of the public, thereby serving public interest and fulfilling social responsibility. Therefore, organizational leadership should encourage full participation in fulfilling social responsibilities as well as quality performance.

3.2 Design and build information systems to enhance the quality of working life of employees.

Organizational leaders have the responsibility to ensure that computer systems enhance, not degrade, the quality of working life. When implementing a computer system, organizations should consider the personal development, physical safety, human dignity, and human fulfillment of all workers. Appropriate human-computer ergonomic standards should be considered in the system design and the workplace.

3.3 Articulate and support proper and authorized uses of an organization’s computer technology.

Because computer systems can become tools to harm as well as to benefit an organization, the leadership has the responsibility to clearly define appropriate and inappropriate uses of organizational computing resources. The number and scope of such rules should be minimized but fully enforced when established.

3.4 Ensure participation of users and other affected parties in system design, development, and implementation.

Present system users, potential users and other involved persons should be directly involved as collaborators in all phases of the system development cycle. Cooperation and collaborative work should be encouraged among all groups involved in designing and using the system.

3.5 Support policies that protect the dignity of users and others affected by a computerized system.

Designing or implementing systems that deliberately or inadvertently demean individuals or groups is ethically unacceptable. Computer professionals who are decision makers should verify that systems are designed and implemented to protect personal privacy and enhance personal dignity.

3.6 Support opportunities for learning the principles and limitations of computer systems.

This complements the imperative on public understanding (2.7). Educational opportunities are essential to facilitate optimal participation of all organizational members. Learning opportunities, which should be available to all members to improve their knowledge and skills in computing, should include information about the consequences and limitations of particular types of systems. Specific limitations include those arising from specific assumptions, simplification of underlying models, complexities, and the improbability of anticipating every possible operating condition.

4. Compliance with the Code

As an ACM member I will…

4.1 Uphold and promote the principles of this Code.

The future of the computing profession as a whole depends upon both technical and ethical excellence. It is important for the individual computing professional to adhere to these ethical principles and to actively encourage others to do so.

4.2 Take appropriate action leading to a remedy if I observe an apparent violation of the Code.

The appropriate action upon observing a Code violation begins with gathering evidence to determine if a violation occurred and, if so, its degree of severity. The individual may wish to consult with other ACM members in this investigation. Once a determination has been made, discussion of the nature of the problem with the apparent violator is appropriate. If the problem cannot be resolved otherwise, it should be reported according to the policy specified in the Policies and Procedures of the ACM, section 8.3.5.

4.3 Understand that violation of this code is inconsistent with membership in the ACM.

Adherence of professionals to a code of ethics is for the most part a voluntary matter. If a member does not accept or follow this code, it should be understood that termination of membership is appropriate.

The Task Force for the Revision of the ACM Code of Ethics and Professional Conduct: Ronald E. Anderson (Chair), Peter Denning, Gerald Engel, Donald Gotterbarn, Grace Hertlein, Alex Hoffman, Bruce Jawer, Doris Lidtke, Joyce Little, Dianne Martin, Donn Parker, Judith A. Perrolle.

Facing the Computer Ethics Dilemma


Advancements in computer technology over the past twenty years have created ethical dilemmas, some similar to those in other professions and some unique to the computer field. Because of the questions that have been raised, and in some instances sensational news accounts of computer irregularities, including fraud, there is a growing perception that self-regulation may be the only means by which the computer professional associations can prevent governments from intervening to regulate the computer profession. In light of recent reports of computer abuse, the adequacy of ethical codes of conduct developed by computer professional societies is assessed here. The relationship between professional codes of conduct and computer ethics education is also examined.

Computer Ethics in the Social Context

Basic ethical values are learned in the formative years of childhood in the home, church and school. To properly apply the notion of ethics to technology, we must first recognize that technology is not value-free, but value-laden. “Any technological decision… is a value-based decision that not only reflects a particular vision of society but also gives concrete form to it (Christenson, 1986).” Computers can alter or create relationships between people and organizations, even where there may have been no prior connection. Data communications can take place without any personal contact and at such high speed that the individual may not have time to consider the ramifications of a particular transmission. Electronic information is also far more fragile than “hard-copy” paper information.

New ethical dilemmas with competing rights and values have arisen due to the advent of high-speed, worldwide transmission; low-cost, mass storage; and multiple-copy dissemination capabilities. Our understanding of proprietary rights, residual rights, plagiarism, piracy, eavesdropping, privacy, and freedom of expression should be examined and perhaps redefined. Advancements in computer technology were made under the assumption that efficiency was the main concern, not moral values. The time has come to integrate an ethical dimension into the concept of managing technology and the human relationships that accompany technological advancements.

Professional Codes of Ethics and Conduct

“When considering the issue of ethical… behavior in the work setting… a basis for ethical behavior can be found in the context of business as a social institution. Second, a rationale for ethical behavior can be obtained from guidelines implied in the notion of professionalism (Truath, 1982, p. 17).” To determine the ethical standards recognized by computer professionals, the existing ethics codes of four major computer professional associations were compared. The codes of ethics shown in Tables 1 – 4 (below) are from the Association for Computing Machinery (ACM), which represents computer scientists; the Institute of Electrical and Electronics Engineers (IEEE), which represents computer engineers; the Data Processing Management Association (DPMA), which represents managers of computer systems and projects; and the Institute for Certification of Computer Professionals (ICCP), which provides a voluntary certification mechanism for computer professionals.

These ethics codes are not the only statements that the four professional societies have made regarding professional practice. ACM has a set of Disciplinary Rules corresponding to the Ethical Considerations under each Canon in Table 1. “The Canons and Ethical Considerations are not, however, binding rules. Each Disciplinary Rule is binding on each individual Member of ACM. Failure to observe the Disciplinary Rules subjects the Member to admonition, suspension or expulsion from the Association… (Weiss, 1982, p. 183).” IEEE publishes an Ethics Source Sheet (IEEE, 1981) outlining the procedures for disciplining members who are alleged to have violated the Code of Ethics. DPMA has published Standards of Conduct that expand on the Code of Ethics shown in Table 3 by providing specific statements of behavior in support of each element of the Code (DPMA, 1989). Regarding these standards, it is stated that “they are not objectives to be strived for, they are rules that no true professional would violate. It is first of all expected that an information processing professional will abide by the laws of their country and community (p. 4).”

ICCP provides an interesting dimension to its ICCP Code of Ethics, Conduct and Practice when it identifies the four essential elements relating to the conduct of a professional. They are stated to be “a high standard of skill and knowledge, a confidential relationship with people served, the public reliance upon the standards of conduct and established practice, and the observance of an ethical code (ICCP, p.1).” To strengthen the professional status of certified computer professionals, ICCP presents what it calls a “fundamental” Code of Conduct covering disclosure, social responsibility, expert opinions, identification of personal qualifications, integrity, conflict of interest, accountability and protection of privacy.

Comparison of the ethical codes of the ACM, IEEE, DPMA, and ICCP produces a number of common themes that emerge as the core of ethical behavior for all computer professionals. The themes shown in Table 5 are: (1) personal integrity / claim of competence, (2) personal responsibility for work, (3) responsibility to employer/client, (4) responsibility to profession, (5) confidentiality of information, (6) conflict of interest, (7) dignity/worth of people, (8) public safety, health, and welfare, (9) participation in professional societies, and (10) increase public knowledge about technology. It is encouraging that in all of the ethics codes of the computer professional societies there is an emphasis on the relationship and interaction of the computer professional with other people, rather than with machines. This properly places the focus of ethical behavior upon ethical or right dealings with people, rather than upon the technology.

One reason that the four codes are not only similar to each other, but also very similar to codes of non-computer professionals is that they take a generic approach to ethics. With the exception of the concern raised about privacy and the confidentiality of data, the codes could have been written to cover most professions and do not fully reflect the unique ethical problems raised by computer technology. Of the four codes, the ICCP code is most clearly geared to the computer profession and deserves further discussion.

Because the main focus of ICCP is certification, the ICCP Code of Conduct is primarily concerned with the competence and, therefore, the credibility of computer professionals. Moreover, it is heavily oriented toward an attempt to enforce standards of competence and accountability. For instance, the Preamble to the Code of Conduct confines its reach to “matters pertaining to personal actions of individual certified computer professionals in situations for which they can be held directly accountable without reasonable doubt (ICCP, p. 2).” Such language is subject to several interpretations and thus will keep lawyers in business for years. It then explains the obligations of certified computer professionals under sections that appear to encourage or prohibit certain conduct in somewhat limited ways.

For example, being socially responsible is defined in section 2.2 as combating ignorance about information processing technology. Integrity in section 2.5 is limited to not “knowingly” claiming competence one does not possess. The prohibitions on conflict of interest in section 2.6 extend only to ensuring that independent advice is provided; if the advice provided is “potentially influential to one’s personal benefit, then full disclosure is mandated.” This section also includes the peculiarly pointed provision that one will not denigrate the honesty or competency of another with intent to gain an unfair advantage. When considering the total breadth of potential conflicts of interest that could be mentioned or applied, these few provisions seem quite narrow. The last two sections, dealing with accountability and the protection of privacy, appropriately emphasize the unique responsibilities of computer professionals with regard to information handling and use.

In addition, the ICCP Code of Ethics provides a “Code of Good Practice that is expected to be amended from time to time to accommodate changes in the social environment and to keep up with the development of the computer profession (p. 1).” The Code of Good Practice provides more detail regarding continuing self-education, personal conduct, competence, false statements, discretion, conflict of interest, and reporting violations. It returns to the traditional notion of a conflict of interest by requiring full disclosure prior to taking a position which is likely to conflict with one’s current duties. In an attempt to establish a self-policing mechanism among computer professionals, the last provision of the Code of Good Practice states that computer professionals are expected to report violations of the Code, testify in ethical proceedings where one has expert or firsthand knowledge, and serve on panels to judge complaints of violations of ethical conduct. ICCP has also established a procedure for revoking certification based upon violations of the ethics code, similar to procedures used by IEEE.

Proposed IFIP International Code of Ethics

Recently, the International Federation for Information Processing (IFIP) has undertaken the ambitious task of developing an international code of ethics for information technology professionals. In attempting to cut across international boundaries in a multi-cultural arena, the proposed IFIP ethics code goes far beyond ethics into the areas of international law and cultural values. At the individual level it covers some of the same issues as the four previous codes. Excerpts from the proposed IFIP Code of Ethics are shown in Table 6.

The code promotes social responsibility through the assessment of social consequences, stresses the protection of established cultural and ethical norms of privacy, defines individual integrity as honesty, probity, objectivity, and trustworthiness in human relations, promotes professional competence, and calls for personal accountability for quality and effects of work done. Nine of the ten common ethics themes found in previous ethics codes are found, either directly or indirectly, in the new IFIP code as shown in Table 7a. Eight additional ethical themes not found in previous codes are raised in the proposed IFIP code and are shown in Table 7b. They are (1) specific statement of social responsibility, (2) establishment of standards, (3) emphasis on quality of life, (4) protection of intellectual property, (5) consequences of networks, (6) basic human rights, (7) rights of the user and (8) equity.

The second section covering International Organizational Ethics is an aspirational statement delineating the voluntary obligations of the international community of computer professionals related to professional standards, certification standards, the quality of working conditions, and user participation and feedback. The third section on Ethics for International Legal Informatics is a call for the development of international law to protect intellectual property rights, to establish legal obligations regarding privacy and other public law matters, to regulate and protect telecommunications networks, and to establish international computer crime law. The final section called International Public Policy Ethics is a utopian statement of desirable human values regarding freedom of communication, the privacy and dignity of individuals, humanized information systems, universal computer literacy, equitable opportunity for information services, and the cultural quality of life.

Ethics in the Context of a Profession

Other sciences and professions that have had hundreds of years to develop ethical concepts continue to wrestle with new and troublesome ethical problems raised by technological advances. Therefore, it is not surprising that a comparatively new field of knowledge, such as computer science, will experience problems in developing ethical concepts and practices. For example, medicine and law are well-defined professions with limited membership. Although they both contain several highly visible public issues, ethical decisions involving their practitioners are made out of the public view for the most part. Some would argue that this process takes place in a self-protective manner for the good of the profession.

The computer field, on the other hand, involves many more people and professions from widely diverse situations. The application of computer ethics is made more complicated because there are computer users at all levels throughout our society. Twenty years ago computers were not nearly so numerous or networked together as they are today. Individuals who controlled computers functioned strictly as computer professionals or computer scientists serving other people by providing them with computer output. Now, because of the widespread use of computers, distinguishing between specialists who work only with computers and those who use them as tools for other disciplines lacks significance. “Computers have become as commonplace as telephones. The related ethical issues have thus become more democratically defined. More people have more to say about computer ethics simply because so many… people are computer-literate… the diffuseness of the impacts and the wide distribution of the technology mean that recognizing impacts, let alone solving an ethical dilemma, is much more difficult… Ethical principles applied to millions of computer users effectively become the equivalent of common law (Parker, et al., 1988, p. 3).” For this reason Parker et al. have identified four computer-specific ethical issues, related to new ways of viewing information processing, assets, instruments of acts, and symbols of intimidation, that need to be addressed in computer ethics codes. These issues are described in Table 9.

An example of a code of ethics that is both computer and profession specific is the Code of Ethics of the International Society for Technology in Education (ISTE), the society that represents over 12,000 computer-using educators at all levels of education. The ISTE Code of Ethical Conduct is based upon principles in nine areas related to use of computers in education and provides rules of conduct in each area. The preamble to the Code of Ethics shown in part in Table 8 reiterates the “importance of people” theme: “Educators should believe in the essential importance of knowledge, morality, skill, and understanding to the dignity and worth of human beings, individually and collectively. As an educator using computers… I will use computers… only in ways that promote the dignity and worth of the learners (ISTE, 1987, p. 51).”

Ethics Without Sanctions

Although the major professional societies have developed codes of ethics, they have been criticized for failing to establish sanctions, enforce them or test their applicability in the real world. Because the codes have been so rarely applied to actual situations, they have not undergone the years of interpretation and practical analysis to which ethics codes in other professions have been subjected. Instead the legal system is being used to settle an increasing number of issues related to computers. Since 1958 there have been over 2,500 reported cases of intentionally caused losses in which computers played a major part or were essential to the scheme (Parker, et al., 1988, p. 2). This situation has prompted the enactment of computer crime statutes in most states as well as two federal laws, the Computer Fraud and Abuse Act of 1986 and the Electronic Communications Privacy Act of 1986. Legality, however, falls far short of what is required for high standards of ethical conduct and awareness.

Leaders in the computer field need to recognize the ethical conflicts faced by computer professionals and to establish ethical standards that are practicable in both the computer science and business communities. Brian Kocher (1989), past President of the ACM, stated that computer professionals must start to police themselves with licensing and certification standards established by the professional societies, or else lawmakers would wrest that prerogative from them by enacting ill-conceived legislation to regulate their activities. Hoffman (1988), a computer security expert, suggested that professional computer users, like automobile drivers, must be licensed if they intend to use their computers in other than a stand-alone mode in their own home or office. This is similar to the on-road versus farm use of motor vehicles. Computer professionals would have their license revoked if they turned from computer user to computer abuser.

Such suggestions of licensing or self-policing are not merely hypothetical musings. There are ample precedents of industries or professions where, given the opportunity to engage in self-policing and having failed to effectively mount adequate programs, the government has used a legislative sledgehammer to force a change. The most recent and highly visible example has been in the government procurement arena, in particular in the defense procurement industry. Concerned with recurring scandals in the defense industry, former President Reagan appointed a Blue Ribbon Panel on Defense Procurement to advise him on how to bring under control the excesses incurred by government contractors that implicated corporate management at the highest levels.

The cornerstone of the Blue Ribbon Panel’s recommendations was a self-policing program that included the following recommendation: when the company, through its own programs, uncovered fraud, waste, abuse and mismanagement, it should voluntarily disclose such things to the government. This program of self-policing was not particularly successful because of the fear of civil or criminal proceedings that might flow from such disclosure. Complaints continued, and Congress passed legislation to require mandatory ethics training for defense contractors and for their government counterparts. Also required was a certification with each contract that the defense contractor is familiar with the laws and regulations relating to government contracting and that they have no information concerning a “violation or possible violation of the laws and regulations (Public Law 100 – 679, 1988).” A false certification is subject to criminal penalties.

Computer Ethics Education

Computer education now begins in elementary school and is no longer a restricted technical specialty learned only by those who are going to design or program computers. Because of the widespread prevalence of computers in society, a core of ethical precepts relating to computer technology should be communicated not only to computer professionals, but to the general public through all levels of education. The issue should be viewed from the perspective of society as a whole as well as from the perspective of computer professionals.

The ISTE ethics code places great emphasis upon incorporating ethical and social impact issues throughout the curriculum, starting at the point when children first become computer users in school. In particular, Principle V, dealing with Student Issues, provides a set of guidelines regarding what students in general need to know about computer ethics. Incorporating the ISTE guidelines throughout K – 12 education would help to address the “society as a whole” issue of computer ethics.

The preparation of future computer professionals should be examined in both the high school and university computer science curricula. The ACM is in the process of developing new recommendations at both levels of curriculum. In the high school curriculum, there will be both general and specific approaches to ethics and social impact issues. The general approach is to incorporate these concerns across the curriculum, not just in computer courses. This is in keeping with the philosophy that computers should be integrated across the curriculum as a tool for all disciplines. The specific approach is to develop social impact modules within the computer courses that will focus on these concerns.

At the university level the ACM faces a yet-to-be-resolved dilemma of how to implement the proposed societal strand in the new curriculum recommendations. There is much discussion, but little action, regarding the necessity of preparing ethically and socially responsible computer scientists, especially in light of the highly publicized computer viruses that are an embarrassment to the profession. To this end the ACM has articulated a tenth core strand – ethical and social impact – that must now be incorporated in computer science programs. The Computer Science Accreditation Board (CSAB), which has accredited over 50 institutions since it was established in 1984, requires instruction in the social implications of computing as a criterion for accreditation.

The dilemma is whether this new strand should be present in all computer science courses or should be taught in a stand-alone course. CSAB allows the topic to be taught as a separate course or to be included as a component of other courses. If it is a stand-alone course, should it be required or elective? Many feel that the across-the-board approach is the best, but cynically question whether you can really “teach old dogs new tricks.” Joseph Weizenbaum, a professor of Computer Science at the Massachusetts Institute of Technology, favors the M. I. T. approach of including discussions of ethics in the context of other computer science courses already in the curriculum to eliminate the tendency of professors “to skip over ethical considerations with the excuse that it is taught in Ethics 101 (DeLoughry, 1988).” However, he recognizes the possibility that the ethics material could receive short shrift in a crammed technical syllabus, as is alleged to occur in many law schools. When combined with other computer science core material, the teaching of ethics is made complicated by the fact that it is not as concrete as the rest of the curriculum. How do we persuade “hard core” computer scientists that social impact material is serious and involves long range implications for the future of computer science?

In accepting the value-laden nature of technology, we should recognize the need to teach a methodology of explicit ethical analysis in all decision-making related to technology. We can borrow from the strategy of traditional university ethics courses to use case studies (Parker, 1988; Weiss, 1982), readings, and discussions in our computer ethics courses. We must teach our students to use the preliminary core of ethical concepts developed by the computer professional societies to first deal with hypothetical cases in order to prepare them to deal with real ethical dilemmas in the future. One method is to answer the five questions in ethics suggested by bio-ethicist Robert Veatch (1977), which, when asked collectively and in sequence, form a general framework for addressing moral dilemmas and justifying their resolution: (1) What makes right acts right? (2) To whom is moral duty owed? (3) What kinds of acts are right? (4) How do rules apply to specific situations? (5) What ought to be done in specific cases? (Veatch, 1977, p. 2).

In a recent ethics workshop Professional Engineer John McLeod suggested another set of generic ethical questions to be asked by individuals in the context of daily professional practice: (1) is it honorable, (2) is it honest, (3) does it avoid the possibility of a conflict of interest, (4) is it within your area of competence, (5) is it fair, (6) is it considerate, and (7) is it conservative of time and resources. Questions such as these can be used effectively to train students to apply ethical standards to both hypothetical and real situations.

The challenge to computer educators is to develop strategies that will raise the awareness of students regarding ethical and moral issues related to computer technology at the same time that they are developing their technical expertise. We should not delude ourselves into thinking that simply teaching about ethics will be a panacea for the problems now faced by society due to computer technology, but we should demonstrate our commitment to ethical behavior by incorporating ethics education into computer education at all levels.

Conclusion

Computer technology is particularly powerful due to its potential to change how we think about ourselves as human beings, how we make decisions in governance and social policy, and how we save and communicate knowledge. Yet, our analysis of the professional codes of conduct reveals that they are still inadequate to deal with emerging technological issues resulting from advancements in the computer field. There appears to be a lack of focus in the computer field in integrating ethical behavior into professional practice. While not wishing to be alarmists, we are suggesting that there needs to be a concerted effort on the part of all the computer professional societies to revise their ethical codes and to incorporate a process of continual self-assessment with formal procedures for reporting suspected improper practices, the availability of due process considerations, and the use of sanctions and possible disciplinary actions (IEEE, 1981).

Because of the sensational media reporting of computer-related irregularities and because of the possible far-reaching consequences of computer abuse, the computer field is coming under increasing scrutiny at all levels of government. To prevent the government from imposing inflexible regulations that might retard computer research and development, the professional societies should take proactive measures toward self-regulation. Since ethical standards are by their very nature “normative,” our precepts for computer ethics will change as new ethical challenges arise from new computer technology. The fact that we are discussing ethics in the context of human-human and human-machine interactions will require some innovative ways to apply ethical principles, but it is a necessary task to be undertaken if we are to mature into a true profession.

The George Washington University

References

Kathleen E. Christensen, (1986) “Ethics of Information Technology” in Gunther Geiss and Narayan Viswanathan, eds., The Human Edge: Information Technology and Helping People. Haworth Press.

Thomas J. DeLoughry, (1988) “Failure of Colleges to Teach Computer Ethics is Called Oversight with Potentially Catastrophic Consequences,” The Chronicle of Higher Education, February 24, 1988, A15.

DPMA, (revised January 1989) DPMA Position Statement Handbook, DPMA, 505 Busse Highway, Park Ridge, IL 60068.

Lance Hoffman, (1988) “Is There a Computer Chernobyl in Our Future?” Testimony before the Subcommittee on Criminal Justice of the Committee on the Judiciary of the House of Representatives, U.S. Congress, Nov. 8, 1989.

ICCP, ICCP Code of Ethics, ICCP, 2200 E. Devon Avenue, Suite 268, Des Plaines, IL 60018.

IEEE, (1979) IEEE Code of Ethics, IEEE, 345 East 47th St., New York, NY 10017.

IEEE, (1981) Ethics Source Sheet, IEEE, 345 East 47th St., New York, NY 10017.

ISTE, (1987) “Code of Ethical Conduct for Computer-Using Educators,” The Computing Teacher, Vol. 15, No. 2, pp. 51 – 53. (ISTE, University of Oregon, 1787 Agate Street, Eugene, OR 97403-9905).

Brian Kocher, (1989) “President’s column,” Communications of the ACM, Vol. 32, No. 6.

Donn B. Parker, Susan Swope and Bruce N. Baker, (1988) Ethical conflicts in information and computer science, technology and business: final report (SRI Project 2609). SRI International, 333 Ravenswood Ave., Menlo Park CA 94025.

Public Law 100 – 679, Procurement Integrity, section 6 of The Office of Federal Procurement Policy Act Amendments of 1988.

Eileen M. Trauth, (1982) “The Professional Responsibility of the Techknowledgable,” ACM Computers & Society Newsletter, Vol. 13, No. 1, pp. 17 – 21.

R. Veatch, (1977) Case studies in medical ethics, Harvard University Press.

Eric Weiss, ed., (1982) “Self assessment procedure IX: a self-assessment procedure dealing with ethics in computing,” Communications of the ACM, Vol. 25, No. 3, pp. 183 – 195.

Appendix: Codes of Ethics of Various Organizations

Table 1: ACM Canons of Conduct

Preamble: Recognition of professional status by the public depends not only on skill and dedication but also on adherence to a recognized Code of Professional Conduct. The following Code sets forth the general principles (Canons) followed by professional ideals (Ethical Considerations)… applicable to each member. An ACM member shall:

Canon 1. Act at all times with integrity:

EC1.1… properly qualify himself in areas of competence.
EC1.2… preface any partisan statement about information processing by indicating clearly on whose behalf they are made.
EC1.3… act faithfully on behalf of employers or clients.

Canon 2. Strive to increase competence and prestige of profession:

EC2.1… encouraged to extend public knowledge, understanding, and appreciation of information processing, and to oppose any false or deceptive statements relating to information processing of which he is aware.
EC2.2… not use professional credentials to misrepresent his competence.
EC2.3… shall undertake only those professional assignments and commitments for which he is qualified.
EC2.4… strive to design and develop systems that adequately perform the intended functions and that satisfy employer’s or client’s operational needs.
EC2.5… maintain and increase competence through a program of continuing education encompassing the techniques, technical standards, and practices in his field of professional activity.
EC2.6… provide opportunity and encouragement for professional development and advancement of both professionals and those aspiring to become professionals.

Canon 3. Accept responsibility for own work:

EC3.1… accept only those assignments for which there is a reasonable expectancy of meeting requirements or specifications, and shall perform assignments in a professional manner.

Canon 4. Act with professional responsibility:

EC4.1… do not use ACM membership for professional advantage or to misrepresent the authority of his statements.
EC4.2… conduct professional activities on a high plane.
EC4.3… encouraged to uphold and improve professional standards of the Association through participation in their formulation, establishment, and enforcement.

Canon 5. Use special knowledge and skills for advancement of human welfare:

EC5.1… consider health, privacy, and general welfare of public in performance of work.
EC5.2… whenever dealing with data concerning individuals, always consider principle of the individual’s privacy and seek the following:

  • to minimize the data collected.
  • to limit authorized access to the data.
  • to provide proper security for the data.
  • to determine the required retention period of the data.
  • to ensure proper disposal of the data.

Table 2: IEEE Code of Ethics

Preamble: Engineers, scientists and technologists affect the quality of life for all people in our complex technological society. In the pursuit of their profession, therefore, it is vital that IEEE members conduct their work in an ethical manner so that they merit the confidence of colleagues, employers, clients and the public. This IEEE Code of Ethics represents such a standard of professional conduct for IEEE members in the discharge of their responsibilities to employees, to clients, to the community, and to their colleagues in this Institute and other professional societies.

Article I. Members shall maintain high standards of diligence, creativity and productivity, and shall:

  1. Accept responsibility for their actions;
  2. Be honest and realistic in stating claims or estimates from available data;
  3. Undertake technological tasks and accept responsibility only if qualified by training or experience, or after full disclosure to their employers or clients of pertinent qualifications;
  4. Maintain their professional skills at the level of the state of the art, and recognize the importance of current events in their work;
  5. Advance the integrity and prestige of the profession by practicing in a dignified manner and for adequate compensation.

Article II. Members shall, in their work:

  1. Treat fairly all colleagues and coworkers, regardless of race, religion, sex, age or national origin;
  2. Report, publish and disseminate freely information to others, subject to legal and proprietary restraints;
  3. Encourage colleagues and coworkers to act in accord with this Code and support them when they do so;
  4. Seek, accept, and offer honest criticism of work, and properly credit the contributions of others;
  5. Support and participate in the activities of their professional societies;
  6. Assist colleagues and coworkers in their professional development.

Article III. Members shall, in their relations with employers and clients:

  1. Act as faithful agents or trustees for their employers or clients in professional and business matters, provided such actions conform with other parts of this Code;
  2. Keep information on business affairs or technical processes of an employer or client in confidence while employed, and later, until such information is properly released, provided that such actions conform with other parts of this Code;
  3. Inform their employers, clients, professional societies or public agencies or private agencies of which they are members or to which they make presentations, of any circumstance that could lead to a conflict of interest;
  4. Neither give nor accept, directly or indirectly, any gift payment or service of more than nominal value to or from those having business relationships with their employers or clients.
  5. Assist and advise their employers or clients in anticipating the possible consequences, direct or indirect, immediate or remote, of the projects, work or plans of which they have knowledge.

Article IV. Members shall, in fulfilling responsibilities to community:

  1. Protect safety, health, and welfare of public and speak out against abuses in these areas affecting the public interest;
  2. Contribute professional advice, as appropriate, to civic, charitable or other nonprofit organizations;
  3. Seek to extend public knowledge and appreciation of the profession and its achievements.

Table 3: DPMA Code of Ethics

I acknowledge:

  1. That I have an obligation to management, therefore, I shall promote the understanding of information processing methods and procedures to management using every resource at my command.
  2. That I have an obligation to my fellow members, therefore I shall uphold the high ideals of DPMA as outlined in its Association Bylaws. Further, I shall cooperate with my fellow members and shall treat them with honesty and respect at all times.
  3. That I have an obligation to society and will participate to the best of my ability in the dissemination of knowledge pertaining to the general development and understanding of information processing. Further, I shall not use knowledge of a confidential nature to further my personal interest, nor shall I violate the privacy and confidentiality of information entrusted to me or to which I may gain access.
  4. That I have an obligation to my employer whose trust I hold, therefore I shall endeavor to discharge this obligation to the best of my ability, to guard my employer’s interests, and to advise him or her wisely and honestly.
  5. That I have an obligation to my country, therefore, in my personal business and social contacts, I shall uphold my nation and shall honor the chosen way of life of my fellow citizens.

I accept these obligations as a personal responsibility, and as a member of this Association. I shall actively discharge these obligations and I dedicate myself to that end.

Table 4: ICCP Code of Ethics


General Statements:

1.1 Certified computer professionals, consistent with their obligation to the public at large, should promote the understanding of data processing methods and procedures using every resource at their command.

1.2 Certified computer professionals have an obligation to their profession to uphold the high ideals and level of personal knowledge as evidenced by the Certificate held. They should also encourage the dissemination of knowledge pertaining to the development of the computer profession.

1.3 Certified computer professionals have an obligation to serve the interests of their employers and clients loyally, diligently, and honestly.

1.4 Certified computer professionals must not engage in any conduct or commit any act which is discreditable to the reputation or integrity of the data processing professional.

1.5 Certified computer professionals must not imply that the Certificates which they hold are their sole claim to professional competence.

Code of Conduct:

2.1 Disclosure: Subject to the confidential relationships between oneself and one’s employer or client, one is expected not to transmit information which one acquires during the practice of one’s profession in any situation which may seriously affect a third party.

2.2 Social Responsibility: One is expected to combat ignorance about information processing technology in those public areas where one’s application can be expected to have an adverse social impact.

2.3 Conclusions/Opinions: One is expected to state a conclusion on a subject in one’s field only when it can be demonstrated that it has been founded on adequate knowledge. One will state a qualified opinion when expressing a view in an area within one’s professional competence but not supported by relevant facts.

2.4 Identification: One shall properly qualify oneself when expressing an opinion outside one’s professional competence in the event that such an opinion could be identified by a third party as expert testimony, or if by inference the opinion can be expected to be used improperly.

2.5 Integrity: One will not knowingly lay claims to competence one does not demonstrably possess.

2.6 Conflict of Interest: One shall act with strict impartiality when purporting to give independent advice. In the event that the advice given is currently or potentially influential to one’s personal benefit, a full and detailed disclosure of all relevant interests will be made at the time the advice is provided. One will not denigrate the honesty or competence of a fellow professional or a competitor, with the intent to gain an unfair advantage.

2.7 Accountability: The degree of professional accountability for results will be dependent on the position held and type of work performed…

2.8 Protection of Privacy: One shall have special regard for the potential effects of computer-based systems on the right of privacy of individuals whether this is within one’s own organization, among customers or suppliers, or in relation to the general public…

Table 5: Common Themes in Professional Ethics Codes


1. Personal integrity/claim of competence: Canon 1.1, 2.3, 4.1; Article Ib, e, IId; Obligation 2; Statement 1.4, 1.5, 2.3, 2.4, 2.5
2. Personal accountability for work: Canon 3.1; Article Ia; Statement 2.7
3. Responsibility to employer/client: Canon 2.4; Article IIIa, b, c, d, e; Obligation 4; Statement 1.3
4. Responsibility to profession: Canon 2, 4.3; Article Id, e, IIe, f; Obligation 2; Statement 1.2, 2.4
5. Confidentiality of information/privacy: Canon 5.2; Article IIIb; Obligation 3; Statement 2.1, 2.8
6. Conflict of interest: Article IIIc, d; Statement 2.6
7. Dignity/worth of people: Canon 5; Article IIa; Obligation 2
8. Public safety, health, and welfare: Canon 5.1; Article IVa; Obligation 5; Statement 2.8
9. Participation in professional societies: Canon 4.3; Article Id, IIe; Obligation 2
10. Increase public knowledge about technology: Canon 2.1; Article IIb, IVa, b, c; Obligation 1, 3; Statement 1.1, 2.2

Table 6: Excerpts from Proposed IFIP Ethics Codes



The IFIP Code of Ethics has been constructed not only for individual Information Technology (IT) professionals but also for multinational organizations and the extended IT community concerned with international legal informatics and related global public policy… The guidelines are global and multicultural and are not intended to reflect any particular ideology or creed.

1. Individual Professional Ethics

1.1 Social Responsibility:

IT professionals strive to use their unique technical expertise to advance international human welfare and the quality of life for citizens of all nations. Computer-related professionals feel an ethical obligation to assess social consequences and to help ensure safe and beneficial use of IT applications.

1.2 Protection of Privacy:

IT professionals have a fundamental respect for the privacy and integrity of individuals, groups, and organizations. They are also aware that computerized invasion of privacy, without informed authorization and consent, is a major continuing potential threat to individuals, groups, and populations. Public trust in informatics is contingent upon vigilant protection of established cultural and ethical norms of information privacy.

1.3 Individual integrity:

IT professionals maintain high standards of personal integrity, which are basic for the harmonious development and fulfillment of organizations and society. Individual integrity encompasses personal traits that create a feeling of pride in the individual, such as honesty, probity, objectivity, sensitivity to others, and trustworthiness in human relations. They respect and defend the free inquiry of their associates. IT professionals do not misrepresent the capabilities, applications, and value of information processing systems for their personal gain.

1.4 Professional Competence:

IT professionals are aware of their personal responsibility to continually maintain and upgrade their technical competence in the swiftly changing domain of computer-based information systems. They are cognizant of the capabilities and limitations of their specialized expertise and the field of information processing broadly conceived.

1.5 Personal Accountability:

IT professionals accept personal responsibility for concurred mutual expectations pertaining to their role and work… They attempt to keep all cognizant parties such as coworkers, managers, clients, and users properly informed on the progress and status of their tasks. IT professionals contribute to objective test and evaluation of information system effectiveness to facilitate beneficial social objectives.

2. International Organizational Ethics

2.1 High Performance Standards
2.2 International Standards and Regulations
2.3 International Legal Protection
2.4 Employee Productivity and Quality of Working Life
2.5 User Participation and Feedback

3. Ethics for International Legal Informatics

3.1 Intellectual Property Law
3.2 International Public Law
3.3 International Telecommunications Law
3.4 International Criminal Law

4. International Public Policy Ethics

4.1 Freedom of Communication
4.2 Privacy and Dignity of Individuals
4.3 Humanized Information Systems
4.4 International Computer Literacy
4.5 Equitable Opportunity for Information Services
4.6 Cultural Quality of Life

Table 7a: Common Ethics Themes as Represented in Proposed IFIP Ethics Code


Theme (section in proposed IFIP Code of Ethics):

1. Personal integrity/claim of competence: 1.3, 1.4, 4.3
2. Personal accountability for work: 1.5
3. Responsibility to employer/client: 1.5
4. Responsibility to profession: 1.4 (indirectly)
5. Confidentiality of information/privacy: 1.2, 4.2
6. Conflict of interest: 1.3 (indirectly)
7. Dignity/worth of people: 2.4, 2.5, 4.2, 4.3
8. Public safety, health, and welfare: 1.1, 4.3
9. Participation in professional societies: not mentioned
10. Increase public knowledge about technology: 4.4, 4.6

Table 7b: New Ethics Themes Found in Proposed IFIP Ethics Code


Theme (section in proposed IFIP Code of Ethics):

1. Specific statement of social responsibility: 1.3, 1.4, 4.3
2. Establishment of standards: 2.1, 2.2
3. Emphasis on quality of life: 2.1, 2.4, 4.3, 4.6
4. Protection of intellectual property: 3.1
5. Consequences of networks: 3.3
6. Basic human rights: 4.1, 4.2
7. Rights of the user: 2.5, 4.3, 4.6
8. Equity: 4.5

Table 8: Computer-Specific Ethical Issues (Parker, et al., 1988, p.2)


Computer issue and associated concern:

1. Repositories/processors of information: unauthorized use of otherwise unused computer services or of information stored in computers raises questions of appropriateness, fairness, invasion of privacy, and freedom of information.

2. New forms and types of assets: assets such as algorithms or computer programs may not be subject to the same concepts of ownership as other assets.

3. Instruments of acts: the degree to which providers of computer services and users of computers, data, and programs are responsible for the integrity and appropriateness of their computer output.

4. Symbols of intimidation/deception: the anthropomorphic view of computers as thinking machines, infallible truth producers that are subject to blame.

Table 9: ISTE Ethical Code for Computer-Using Educators


Principle I. Curriculum Issues – I have some responsibility for defining the roles of computers in the school curriculum and for assessing significant and likely intended and unintended consequences of those roles….

Principle II. Computer Access – I support and encourage policies that extend equitable computer access to all students, and I will actively support well-reasoned programs and policies that promote such use….

Principle III. Privacy/Confidentiality – I have varying degrees of responsibility for the development of policy that guarantees the proper use of computerized and non-computerized information in the school’s possession….

Principle IV. Teacher-Related Issues – In order to redefine the teacher’s role in light of the integration of computers into classrooms, each teacher must have a minimum level of general computer literacy, including skills and knowledge about computers appropriate to the classroom setting and subject area. In addition, each teacher must accept the responsibility to practice as a professional according to the highest ethical standard….

Principle V. Student Issues – One way to measure success is by the progress of each student toward realization of potential as a worthy and effective citizen. To help fulfill this goal, I will:

  1. help students learn about future trends and possible impacts and consequences of a computerized society,
  2. demonstrate respect for computer ethics in the school, which includes not permitting unauthorized duplication of software by my students,
  3. ensure that students have opportunities to evaluate their current and future roles and the impact their actions can have on future consequences in a computerized society,
  4. help students to evaluate the models which underlie simulations on which major societal decisions are made, and
  5. help students examine issues that relate to computer ethics.

Principle VI. Community Issues – The general community, parents and educators share responsibility for creating learning environments. In fulfilling responsibilities to the community I will:

  1. provide training to the members of the educational or general community when asked and when practical to increase parental and community knowledge of possible educational goals that involve computers… encourage parental involvement in long-term planning of computer use, coordinate expectations for computer use between home and school,
  2. extend the standards for respect of copyright into school/ community interactions, and
  3. evaluate what control donors should have over the use of hardware and software they provide.

Principle VII. School Organization Issues – Effective and efficient use of computers in education requires organizational support.

Principle VIII. Software Issues – I have some responsibility for the acquisition, development and dissemination of software in the school environment.

Principle IX. Hardware Issues – I share responsibility for the quality and improvement of hardware used by educators and students.

Track Report: Computer Privacy and Confidentiality

National Conference on Computing and Values
Report on the Track: Privacy and Confidentiality
Report of the NCCV Working Group on Privacy and Confidentiality

Jacques N. Catudal, Ph.D.

The NCCV Working Group on Privacy and Confidentiality:

Cynthia Alexander Paul Hyland George Nicholson
Daniel Appelman Ernest Kallman Sam Nicholson
Tora Bikson Jillian Kendall Richard Rosenberg
Jacques N. Catudal John Ladd Brad Templeton
Neil Charney Ronald Lancaster Stephen W. Thompson
Dave Colantonio Blaise W. Liffick Arnold B. Urken
Clifford Collins Pierre Mackay Richard G. Vance
Joanne Costello Claire McInerney Willis H. Ware


These collective reflections on privacy, confidentiality and computers are necessarily brief, impressionistic and incomplete. Given the vastness of the topic, and the requirement to produce concrete recommendations by the close of the Conference, the Working Group focused discussion on three areas: (1) conceptual ambiguities frustrating a fuller and more useful understanding of the very concepts of “privacy” and “confidentiality”; (2) the use and abuse, morally speaking, of electronic mail and electronic bulletin boards; and (3) the development and sale of databases, particularly as the latter affect the lives and well-being of private citizens. Throughout, the principal constituencies needing to be concerned with problems of privacy and confidentiality were identified as ordinary U.S. citizens, including the economically disadvantaged and uneducated; educational institutions; private corporations; and government agencies. Across all constituencies, database owners, publishers and users, as well as software developers and system managers, bear a special relation to the problems. However, problems of privacy and confidentiality affect all Americans.

Seeking a Clearer Understanding of “Privacy” and “Confidentiality”

What assumptions are we making in discussing privacy and confidentiality in the context of computer technologies, particularly as computer privacy relates to computer security? We assume that without a desire for privacy the desire for security makes little sense. On the other hand, even if we assume a totally secure system, privacy problems don’t go away. For purposes of getting a better understanding of privacy and confidentiality, it might be useful to assume that all systems are completely secure, thereafter determining the nature of the privacy problems that remain. Of course, we should do this without losing sight of the fact that in the real world, there can be no such thing as a completely secure system. So, we also need to determine the nature of privacy problems when systems are secure to varying degrees.

Of course, it could also be argued that the security issue is beside the point. Consider Lotus’ Marketplace database: the developers didn’t have to break into any system to acquire the information, and yet it poses a serious threat to privacy. Similarly, physicians and nurses have unrestricted access to patients’ confidential medical records, and problems may arise if physicians and nurses do not treat this information as confidential.

But what do “privacy” and “confidentiality” mean? Why, in moral and social arrangements, are privacy and confidentiality important? Is there a moral right to privacy and, if so, on what philosophical (or other) basis is it founded? In short, what moral arguments can be advanced on behalf of the individual’s right to privacy? Similarly, is there a legal right to privacy and, if so, on what basis is it founded? A recent poll by the Los Angeles Times found that 71% of Americans believed they had a right to privacy. What did those polled understand by “privacy”?

We need to define privacy before we protect it. At first glance, what information is or is not private seems somewhat subjective. For example, some persons may not want their age or salary divulged, while others seem not to care whether such information is public or private. This suggests subjectivism. Further, what information is regarded as appropriately public or private may be a function of the culture to which one belongs. And this suggests relativism.

Is privacy a right? If so, then it is the kind of right that may be selectively exercised or waived. Talking to a doctor about one’s bowel movements involves waiving one’s right to personal privacy. We waive the right in such instances because it is in our best interest to do so. This too suggests that there is an element of relativism involved in our exercising or waiving the right to privacy. However, it should be noted that in most such cases, the assumption of confidentiality is an important aspect of the disclosure.

While examples can be offered suggesting that privacy may be a subjective matter (again, whether one wants one’s age or salary made public), such examples cannot address the more fundamental question of whether it is the case that everyone wants to control the process in which decisions are made regarding the private or public status of information about one’s self. Further, we should note that when we decide to divulge information to a doctor or lawyer, the confidentiality of such information is protected by law. Here we ought to consider whether it would be a good idea to afford persons similar legal protections regarding the practices, for instance, of credit bureaus.

An example may serve to illustrate the advantages and disadvantages of current credit bureau practices. Consider an individual who, for several years, has done business with a small independent bank that does not report to any credit bureau the financial histories of its customers. One day the individual moves to a larger town and wishes to take out a car loan from a larger bank. He is refused the car loan because the new bank is unable to determine the individual’s credit history.

There may be a fallacy in thinking that what Americans want most is privacy. They may in fact want more credit and greater convenience. People do not want to go back to the days when obtaining a car loan took two weeks and securing a mortgage took several months. Of course, Americans may also want to know what information about them is being disseminated. But satisfying both desires need not be an impossibility.

How can privacy and confidentiality be distinguished? Privacy belongs to an individual, and holds between the individual and the world. Confidentiality involves a relationship between two people. In confidential arrangements, there is an implicit agreement between persons that information won’t be passed on, perhaps even an implicit promise. Such personal relationships imply a consent to retain information as well as a measure of trust. In the research arena, for example, a researcher “promises to hold” information. In some cases, the breaking of the promise may be held not only against the individual who breaks the promise, but against the institution for which the individual works. Confidentiality may be construed as a tool we use to assure privacy.

We’ve noted that relations exist between the problem of privacy and the problem of security. It should also be said that relations exist between privacy and ownership, and between privacy and access. Determining the nature and extent of these relations is imperative if we are to acquire a clearer and more comprehensive understanding of the nature of privacy and confidentiality in computerized settings. Studies of these problems in isolation are bound to be inadequate.

The Use and Abuse of Electronic Mail and Electronic Bulletin Boards

Bulletin board operators and e-mail managers in academic, governmental, and corporate settings are currently confronted by questions of privacy but they have few policies to guide them in framing solutions. For example, a campus bulletin board operator comes across potentially libelous information about a professor. What should she do? An e-mail manager discovers that one of her users is receiving hate mail. She is told by a supervisor to fix the problem but is given no direction amidst potentially conflicting solutions. There is a need for institutional policies regarding privacy and confidentiality. Are messages between employees private or public? Management is often reluctant both to formulate and consistently enforce policies.

When one uses an e-mail service, the assumption should be that all communications are private. In some corporate and academic institutions, the reverse holds true. The argument is that if the institution owns the hardware and software with which the service is provided, it owns all messages that are produced using them. Therefore, it has a right to read, edit, censor, or otherwise alter or destroy any message produced on its premises. Some would view this argument as involving a dazzling non sequitur. Does it follow that because a corporation owns the telephones that employees use, it can monitor employee telephone conversations? There appears to be good reason for supporting a “default standard” that favors privacy and confidentiality.

Opponents argue that the current default standard is acceptable so long as current and prospective employees are informed that their communications may be read, edited, etc. After all, they retain the choice to work or not work for corporations that invoke the standard. If the employee then chooses to work for the corporation, they do so with a clear understanding of the organization’s policies.

Is there a problem of privacy if employees believe falsely that their communications are not being read by others? Alternatively, is there a problem of privacy if an employer does not inform her employees that she is reading their e-mail? In such cases, it may be useful to distinguish “real” from “apparent” privacy. If I take from your cash box, and you are very wealthy and don’t miss the loss, have you been harmed? Is this a fair analogy?

At some universities, there is a practice of searching students’ electronic files for “unacceptable” items, such as “four-letter words,” “pornographic” depictions, etc. Justification for the practice is sometimes given by appeal to the claim that if the University owns and operates the system, it is entitled to access all information obtained, sent or otherwise transmitted over the system. But would a university invade a student’s dorm room to search for pornographic magazines? The protections afforded students against unreasonable search of their university living quarters and seizure of their personal property apparently do not extend to computer systems. The same issue confronts faculty. Does the university or college have a right to edit, censor, or dump electronic files that contain, from its point of view, “inappropriate” materials? For example, should the university use programs that scan faculty or student files for “four-letter” words?

On the other hand, what should system managers do when they have suspicions about a particular account? For example, what should they do if they discover in a user’s account programs used to illicitly capture other users’ passwords, or actual tools for burgling other people’s accounts? If the manager knows and does nothing, is she guilty of complicity?

These cases call for a distinction between actions taken against an account for the sake of preserving the integrity of the system (as in the burglary tool case), and actions taken against an account in the name of “good taste” (as in the case of obscene language and pornographic depictions).

Much information of a personal and, sometimes, of a critical sort gets passed around on a university’s e-mail system. Criticism of the practices of a colleague, dean or president may or may not be valid; but the right to free expression, even critical expression, is protected by the Bill of Rights. Imagine a program that can scan all occurrences of the university president’s name for the purpose of determining attitudes toward the president. One of our most important goals as teachers is to teach students to respect the rights of others. The use of such a search program is blatant hypocrisy.

There are grievance procedures at most schools that give meaning to the expression “due process,” and that must be adhered to. One should not open another’s computer file unless one has reasonable cause to do so. But criteria enabling us to judge that reasonable cause exists for any given case must be clearly articulated. Further, the judgment that the criteria have been met must not fall to a single individual, a system manager or a university administrator, but must result from consensus.

Some institutions strive to eliminate the distribution of possibly offensive electronic bulletin board messages by having a faculty or staff member screen all messages prior to posting. However, while the practice may succeed in eliminating obscene language, pornographic depictions, and advertisements of the sort discussed at this conference in Leslie Burkholder’s track address (i.e., the Case of the “Filipino Love Slave For Sale,” in which a student advertises for sale the sexual services of a woman), the practice also has the effect of transposing the focus of inquiry to the moral status of the screening practice itself.

At some universities and colleges, students sign a statement which informs them that the files they create should only relate to their courses, and that their files will be reviewed by their teachers. By signing the statement, the student acknowledges that she understands the rules of file creation and management at the institution and agrees to act by their terms.

The problem of “compounded wrongs” must also be considered. Consider the secretary who discovers that there exists information portraying her as wholly incompetent and disreputable, information she discovered by reading others’ e-mail. She storms into her boss’ office and demands that he do something. What should the boss do?

System planners and policy designers should offer administrators several options from which to choose in developing e-mail, bulletin board, and other computerized systems. By the same token, administrators must choose or commit to particular policies, including policies that address privacy and confidentiality. Failure to choose is itself a choice, and a problematic one at that. The problem at many, if not most institutions, whether governmental, commercial, or educational, is that there is no upfront statement of policy regarding conditions of use, penalties for misuse, and grievance and appeals procedures. The absence of such policies represents one of the greatest threats to the integrity of our systems. For where no policies exist, those who would abuse the system may successfully defend their mischief by pointing to the void. Here it may be useful to distinguish causing another harm by exploiting her vulnerabilities, from causing another harm by exploiting her negligence. In 1991, failure to adopt appropriate policies, and to clearly state these policies upfront, may very well constitute a form of negligence.

On the other hand, we recognize that the best of policies cannot prevent those who are determined to get their way. Clearly, some students believe they should have access to all information they are capable of securing from a college or university, whether confidential or not, and that it is the institution’s job to set up appropriate mechanisms for denying access wherever appropriate. (This suggests the spirit of a game – “Catch Me If You Can” or “I Can Do You One Better”.) Are legal and technological measures the only tools at our disposal for handling problems of privacy and confidentiality? A third option is education, i.e., assisting students in reflecting on the values of privacy and confidentiality. Educational tools are needed to rid some students of harmful beliefs and attitudes. In this regard, the teaching of computer ethics by computer scientists ought to be regarded as an aspect of their professional responsibilities.

Electronic Databases and Privacy

Cases like Lotus’ Marketplace database illustrate that what is distinctive about privacy problems engendered by computer technologies is the far greater order of magnitude of the problem. The effects of such databases are dramatically new, occasioning unanticipated problems. Several examples can be offered to show how quickly and widely the dissemination of information can take place. An individual suffers a traffic accident; within a week he receives letters from four lawyers expressing an interest in representing him. Police blotters are published in many local newspapers; now, however, some police departments maintain “E-blotters.” It is now possible for a person in New Haven to find out who was arrested for DWI last evening in some small West Coast town. Serious problems may result from incorrect information and from the difficulties associated with securing retraction when such is the case. There is sometimes an element of anonymity or invisibility surrounding the source of the (incorrect) data. The effect is to deny individuals due process and the right to confront their accusers.

So, what seems new about the problems of privacy and confidentiality is (1) the scale of the problem, i.e., the number of persons affected; (2) the invisibility of the source, i.e., unaccountability; (3) the number and kind of negative effects generated whenever the data is incorrect; and (4) the difficulty of obtaining retractions and, consequently, the persistence of negative effects. The harm that can be done to people when information is incorrect is dramatically illustrated by the current practices of credit bureaus.

Currently the default standard in matters of information about individuals is such that information that one releases about one’s self may be disseminated to others and may be manipulated (merged) in ways the individual has not anticipated. Perhaps the assumption should be the reverse; viz., that a corporation may not release any information about an individual unless it has secured that individual’s permission. People do not make a contribution to a database believing that the information will be merged and retained for very long periods of time. Most people are not aware of the ways in which the information is used or can be used.
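
The reversed default proposed here is, in effect, a “default deny” rule: no release without a recorded, purpose-specific permission from the individual. A minimal sketch of such a rule, with entirely hypothetical names (none of them drawn from these proceedings), might look like this:

```python
# Hypothetical sketch of an opt-in ("default deny") release rule: a
# corporation may not pass on information about an individual unless that
# individual has explicitly permitted the specific recipient and purpose.

consents = {}  # (subject, recipient, purpose) -> True once consent is granted

def grant(subject, recipient, purpose):
    """Record the individual's explicit permission for one kind of release."""
    consents[(subject, recipient, purpose)] = True

def may_release(subject, recipient, purpose):
    """Default is refusal: absence of a recorded consent blocks the release."""
    return consents.get((subject, recipient, purpose), False)

grant("A. Citizen", "Big Bank", "credit check")

assert may_release("A. Citizen", "Big Bank", "credit check")
# No consent was recorded for marketing, so release is refused by default --
# as is any merged or repurposed use the individual did not anticipate.
assert not may_release("A. Citizen", "Mail-order firm", "marketing")
```

The design choice is simply which way the burden falls: under the current default, release proceeds unless forbidden; under this sketch, release is forbidden unless permitted.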

An important issue with these cases concerns the retention of data for long periods of time. When should data die? The second Ethical Consideration of Canon 5 of the ACM Code states:
“An ACM member, whenever dealing with data concerning individuals, shall always consider the principle of the individual’s privacy and seek the following:

  • To minimize the data collected.
  • To limit authorized access to the data.
  • To provide proper security for the data.
  • To determine the required retention period of the data.
  • To ensure proper disposal of the data.”
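
The retention and disposal principles just quoted admit of a concrete, if simplified, rendering in code. The sketch below is purely illustrative; the `Record` class, its field names, and the sample retention periods are hypothetical, not drawn from any actual system.

```python
from datetime import date, timedelta

# Hypothetical record carrying its own retention metadata, per the
# ACM principle of determining a required retention period.
class Record:
    def __init__(self, subject, data, collected, retention_days):
        self.subject = subject          # the individual the data concerns
        self.data = data
        self.collected = collected      # date of collection
        self.retention_days = retention_days

    def expired(self, today):
        """True once the required retention period has elapsed."""
        return today > self.collected + timedelta(days=self.retention_days)

def dispose_expired(records, today):
    """Proper disposal: drop, rather than archive, expired records."""
    return [r for r in records if not r.expired(today)]

# Illustrative data: one stale record, one still within its period.
records = [
    Record("alice", {"traffic_accident": True}, date(1990, 1, 1), 365),
    Record("bob", {"address": "..."}, date(1991, 6, 1), 365),
]
kept = dispose_expired(records, date(1991, 8, 1))
```

The point of the sketch is that a retention period must be attached to the data at collection time; a database that records no such period has no principled answer to the question of when its data should die.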

The position advanced by Professor Richard A. Wright (“Information As a Commodity: Control and Benefit Are Morally Owed to the Source”), while bold, may yet present us with another occasion “to strike a deal.” That is, Wright’s position, if implemented, might lead people to sell away their privacy. The problem of privacy is not one of proper compensation for the use of information about one’s self, but of knowing what will be done with that information. Wright appears to assume that since we cannot control the use of information about ourselves, we ought at least to be properly compensated for it (i.e., as market value may determine).3


Reactions to problems of privacy and confidentiality seem to be driven by recourse to technological and economic considerations rather than normative ones. The dominant way of dealing with such problems is also explained (though not justified) by political factors, since adopting normative policies is seen as politically risky. Problems are dealt with on a case-by-case basis, not on the basis of an overarching policy. If an overarching policy is preferred, a federal approach might be preferable to a local or state approach. In Canada, there exists the Office of the Privacy Commissioner, whose function it is to apprise national policy makers of potential problems and issues. In any case, the development of an overarching policy need not and ought not lead to the creation of a large bureaucracy to oversee computer-associated privacy and confidentiality problems (as may or may not happen in Europe). The issues of what information about one’s self ought to be made public, and how that information may be used, seem best left to individuals themselves, assuming the development and passage of legislation sufficient to allow individuals to record and enforce their preferences.

While some uses of computer technologies threaten the privacy and confidentiality of individuals, we ought not lose sight of the fact that different uses of the same technology may provide us with new and better ways of safeguarding privacy and confidentiality. And this is the point: it really is a matter of how we use the technologies. For example, consider the case where a proximity identification system is used to limit access to certain buildings; students with the right cards gain access, those without do not. But the same technology can be used to track students, in effect, to determine where and with whom they congregate. License plate readers make passage through toll booths a speedier affair, thereby eliminating long lines. But the same technology can be used by Immigration and Naturalization officers to track “undesirables.”

With the aim of realizing a better understanding of the problems of privacy and confidentiality and, ultimately, with the aim of promoting an ethical practice of computing, the NCCV Working Group on Privacy and Confidentiality offers the following recommendations and policies:
First, the Research Center on Computing and Society should:

  1. Conduct research to determine (a) the kind of information that is being gathered about individual Americans, (b) the uses to which the information is being put, (c) the extent to which the information is being propagated, and (d) the amount of revenue that is being generated through the sale of such information.
  2. Identify existing laws that bear on issues of privacy and confidentiality. If such studies already exist, as is reasonable to suppose, they should be collected. While some individuals believe that sufficient privacy and confidentiality protection already exists in the Bill of Rights, the advantages and disadvantages of an “Electronic Bill of Rights” for the home, school, and work place should be examined.
  3. Pursue avenues of research into the practices of the private sector with special emphasis on the retention and disposal of information about individuals, and the extent to which stored information is stale or outdated.
  4. Investigate the effect of adequate privacy policy on the private sector’s ability to be competitive worldwide.
  5. Pursue the issue, pro and con, of whether there can ever be a morally justifiable exception to the rule that data ought never be used for purposes other than that for which it was originally collected.
  6. Pursue the moral, technological and economic feasibility of notifying persons whenever information about them is being used.
  7. Determine ways in which individuals can preserve anonymity in a technologically advanced society without greatly diminishing their quality of life.

Second, the National Science Foundation should:

  1. Press for a presidentially or congressionally funded national commission, with staff, to produce a report on all relevant aspects of computing, privacy, and confidentiality. Such a report should result in policy options.
  2. Independently of the national commission, NSF should sponsor a “micro-commission” addressed to issues of privacy and confidentiality in academia.
  3. Institutions seeking funds from NSF for purposes of attaching to a network should include with their requests an institutional policy statement regarding issues of privacy and confidentiality. NSF should refuse to disburse funds until such time as privacy and confidentiality policies have been adopted at the requesting institution.
  4. NSF should encourage studies seeking to determine the ways in which computer technologies may enhance equitable access while preserving privacy and confidentiality whenever appropriate.
  5. NSF should sponsor cross-generational computer-centered projects, perhaps under the title “Creating Our Computer Futures.” Such projects should bring young and old together, and exploit excitement for the technology while drawing upon a deeper understanding and appreciation of moral conduct and values. Young and old would produce and learn to use correctly such tools as electronic bulletin boards, thereby addressing the need for greater education.
  6. NSF should fund or sponsor summer institutes for teachers at all levels of education on the topic of privacy and confidentiality in computing. The objective would be to equip teachers with current knowledge and techniques needed to immerse their students in the issues.

Finally, the Working Group supports the following policies.

  1. Producers of databases should be obligated to date their data and to specify the source of the information. They should also state the period of validity of the data, and if unable to do so, should provide visible, unambiguous and otherwise adequate disclaimers.
  2. Database owners, developers, and users should be obligated to act in a manner consistent with the “Code of Fair Information Practices.” In particular, data should never be used for purposes other than that for which it was collected, unless the individual about whom the data is collected is informed and gives consent.
  3. All persons who use computer services such as e-mail, bulletin boards, etc., should be provided with “upfront” information regarding such matters as whether their files may be searched, and the types of files to be searched; whether messages and advertisements are edited and/or censored, in part or in whole; and what the methods of enforcement and penalties are, as well as the appeal procedures available whenever one is judged to have breached system policy.
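
The first of these policies, that database producers date their data, identify its source, and either state a period of validity or attach a visible disclaimer, can be sketched as follows. The sketch is purely illustrative; the field names and sample entries are hypothetical assumptions, not part of any existing database system.

```python
from datetime import date

# Hypothetical database entry carrying provenance metadata, per the
# policy that producers date their data and specify its source.
def make_entry(value, source, dated, valid_until=None):
    entry = {"value": value, "source": source, "dated": dated}
    if valid_until is not None:
        entry["valid_until"] = valid_until
    else:
        # No known validity period: attach a visible, unambiguous disclaimer.
        entry["disclaimer"] = ("Period of validity unknown; "
                               "this information may be stale or outdated.")
    return entry

def is_stale(entry, today):
    """An entry is stale once its stated validity has lapsed; an entry
    with no stated validity is treated as suspect by default."""
    if "valid_until" in entry:
        return today > entry["valid_until"]
    return True

# Illustrative entries: one with a validity period, one without.
e1 = make_entry("DWI arrest", "county police blotter",
                date(1991, 3, 1), valid_until=date(1992, 3, 1))
e2 = make_entry("home address", "mail-order list", date(1988, 5, 1))
```

On this rendering the burden falls where the policy places it: an entry whose producer cannot state a period of validity carries its disclaimer with it, and a consumer applying `is_stale` will treat it as suspect rather than as silently current.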