Computer Ethics Issues In Academic Computing

The National Conference on Computing and Values (NCCV) was held on the campus of Southern Connecticut State University in August 1991. The Conference included six “tracks:” Teaching Computing and Human Values, Computer Privacy and Confidentiality, Computer Security and Crime, Ownership of Software and Intellectual Property, Equity and Access to Computing Resources, and Policy Issues in the Campus Computing Environment. Each track included a major address, three to five commentaries, some small “working groups,” and a packet of relevant readings (the “Track Pack”). A variety of supplemental “enrichment events” were also included.

This monograph contains the proceedings of the “Policy Issues in Campus Computing” track of NCCV. It includes the “track address” with two commentaries, four enrichment papers, the conference bibliography, and a report on the activities and findings of the small working group on campus computing policies. The track address is “Computer Ethics on Campus” by Leslie Burkholder; the commentaries are “Making a Code of Computer Ethics Work at Pimli College” by Sally Webster and “Intricacy and Impacts of Computing Policies on University Campuses” by T.C. Ting.

The enrichment papers include “Policy and Guidelines: Some Comments as the University of Delaware’s Draft Responsible Computing Policy Nears Approval” by Richard Gordon, “Recommended Guidelines for Responsible Computing at the University of Delaware” by University of Delaware Staff, “The Ethics of Evaluating Instructional Computing” by Marvin J. Croy, and “Some Effects of Computer Technology on Human Interaction and Individualization in the Teaching of Deductive Logic” by Marvin J. Croy, Michael G. Green and James R. Cook.

Marvin J. Croy was the “Track Coordinator” for this track, and the Appendix of this monograph is his report on the activities and findings of the small working group of the track.

The National Conference on Computing and Values was a major undertaking that required significant help from many people. The Editors would like to express sincere thanks to the National Science Foundation and the Metaphilosophy Foundation for support that made the project possible. And we wish to thank the following people for their invaluable help and support: (in alphabetic order) Denice Botto, William Bowersox, Aline W. Bynum, Robert Corda, Donald Duman, Richard Fabish, James Fullmer, Ken W. Gatzke, Steven J. Gold, Edward Hoffman, Rodney Lane, Sheila Magnotti, Armen Marsoobian, John Mattia, P. Krishna Mohan, Beryl Normand, Robert O’Brien, Daniel Ort, Anthony Pinciaro, Amy Rubin, Brian Russer, Elizabeth L.B. Sabatino, Charlene Senical, J. Philip Smith, Ray Sparks, Larry Tortice, Suzanne Tucker.

Computer Ethics on Campus

Consider the plight of Pimli College’s Computing Advisory Committee. (Pimli is, of course, a fictitious college. It is a middle-size private college with a middle-size reputation, located somewhere to the east or west or north or south of here.) There have been several incidents of computer abuse on campus. Computing staff have been reading other people’s electronic mail. Students have been experimenting with computer viruses on public cluster machines. Faculty have been copying licensed software for use at home. As a consequence, the committee has been asked by the head of Academic Computing to do something. His suggestion is that the college should have a Code of Computer Ethics, a set of standards regarding the proper use of its computing facilities. Is this a good idea? Where do they begin? What topics should they include? Should they do something else besides writing up a code? Indeed, should they even bother to struggle with composing such a code at all?

Why a Code?

Sensibly enough, the committee asks first questions first. Why, committee members ask one another and the head of Academic Computing, should we have a computer ethics code? What is it supposed to do?

One answer the committee gets, from the head of Academic Computing, is that it will encourage reasonable behavior with computers on campus. It will reduce, though it won’t guarantee to eliminate, computer abuse. Of course, he says, it will have to be put into student, staff, and faculty handbooks to have that effect. Perhaps it will have to be distributed or lectured on at orientation sessions for new members of the college community. Perhaps everyone will even be asked to sign a statement saying that they have read and agree to abide by the code. But improved ethical behavior is what the head of Academic Computing plans to get from the code. (The response is not unique to computer misuse, of course. An informal survey at Carnegie Mellon University recently uncovered lots of cheating on exams and class assignments. One response was: institute an Honor Code; it will reduce the incidence of these events (The Tartan, 29 April 1991).)

Committee members, some of them skeptical scientists, wonder: How will a code do this? In fact, is there any evidence that codes do this sort of improving thing?

So the head of Academic Computing elaborates. People may misuse campus computing facilities because they know no better or because they aren’t motivated to do better. The code, he says, will either make members of the campus community more knowledgeable or more motivated.

Sometimes people do wrong because they haven’t realized that certain activities are wrong and harmful. Perhaps staff read other people’s electronic mail because it is so easy to do that it is hard to realize that it’s wrong. In that case, the code can, as the jargon has it, raise their consciousness or awareness. Sometimes people are puzzled about what the right course of action is. May a researcher, for example, look at files recording a student’s revisions of his essays, without that student’s permission, in order to complete a study of how a writing tool is used? The code can, at least sometimes, provide the answer and so put the puzzled person on the right path.

Sometimes people know, for example, that copying licensed software or reading another’s files is thought to be wrong. Sometimes they are even pretty sure in their own minds that these are wrong. But, as with someone who is pretty sure that eating that pint of Ben & Jerry’s ice cream is not really the best of things to do, temptation can win. A code, especially a code with punishment for violations, says the head of Academic Computing, can provide an extra boost of motivation to do what’s right.

In addition, he points out, disciplining people for computer abuse without an explicit code or other warning can itself sometimes be wrong. What people, especially students, often are heard to say when disciplined is, “No one told us we shouldn’t do it. No one told us we would be punished. No one bothered us about it before.” If so, how can they be justly disciplined? Of course, the reasoning here may sometimes be a little suspect. A person is excused from punishment if he can sensibly have thought his actions were innocent of wrongdoing. If you find yourself at a software vendor’s booth at a computer show, in front of what all the signs suggest is a stack of free demo disks, no one can complain if you help yourself. But can it really be true of students or faculty or others who copy licensed software for home use that they have no reasons to think it a bad act? On the other hand, at least if it’s said in the college computer ethics code that it’s wrong and punishable, then the excuse of innocent ignorance is unavailable.

By now, other members of the committee have thought of reasons for having a code.

One member, a user consultant who spends her time out in the clusters, has thought of a dark reason. Sometimes, she says, people know what’s right and would like to do it. They are not tempted by their own independent desires to do wrong. What happens is that they are pressured by their superiors or circumstances to do wrong. We all hear about this sort of thing in industry, she says, but it happens in schools and universities too. A student assistant is told by a professor to look through computer files the professor hasn’t permission to look through. The new and nervous manager of computer clusters is told to make sure that there are sufficient copies of a statistics package but is not given a big enough budget to achieve this. Even if these people would like to do what’s right, can they resist the pressure that might be put on them to do something that’s wrong? Perhaps, she hazards, a code can help. They can point to its provisions when refusing to comply with such requests.

The software acquisitions manager and the college’s legal counsel jump in. The software acquisitions manager says she thinks it would be easier to work with software vendors if she could give them some assurance that their software wasn’t being pirated. She believes the college’s adoption of a code, with appropriate discipline for illegal software copying, would help. At least, she says, it would show them that Something Is Being Done. The college’s lawyer is worried about liability matters (cf. Johnson, Olson, and Post, 1989). While he can’t predict what might happen with certainty, he is worried that a software company might sue for illegal software copying or that a business or private individual might sue for damages caused by a loosed computer virus started on campus or for damages caused to files by student hackers snooping from campus machines. In any of these cases, he says, the courts might treat a member of staff or faculty or even a student as an agent of the college, as someone acting with its permission. The lawyers then would go after the college because it has deeper pockets. Having a code might help protect the college. He, the college’s lawyer, could argue in court that it had taken precautions against just such actions by its members. The computer abusers were not acting as agents of the college and so, whatever their personal prospects, the college itself shouldn’t be held liable.

Perhaps, says one of the skeptical scientists on the committee, what the software acquisitions manager and the college’s lawyer say is true. No doubt they have the knowledge and experience to judge. The head of Academic Computing certainly has a nice theory about the benefits of a code. The user consultant certainly tells a terrible tale. But will a code help? Does anybody know whether it will have the good effects claimed for it?

A business ethics professor has an answer. Many businesses have codes of conduct, he explains. Lots of companies adopt them after misconduct by employees at various levels is uncovered (Lee, 1986). Do they have a good effect? Do they reduce misconduct? Well, he says, there are lots of testimonials by High Company Officials that they do. But these aren’t really worth much more as proof of their effect than testimonials from famous athletes about the benefits of breakfast cereals or training shoes. There is the case of the bribery scandals in the ’70s. Lots of US companies, after they were discovered to have been bribing government officials in foreign countries for one reason or another, adopted codes forbidding bribes. The incidence of bribery fell. Did the codes cause that? Maybe, but too much else was going on – government action on bribery and a public outcry, for example – to tell.

On the other hand, he says, there is something pretty solid. Two professors ran some experiments at about the time of the ’70s bribery scandals (Hegarty and Sims, 1979). They wanted to determine, among other things, the effect of ethics codes on willingness to use bribes. The subjects were business school students. They were divided into two comparable groups. Members of both groups were to act as sales managers for a company. Both were told that bribes would increase business. One group was told that unethical behavior was against company policy. And the researchers were reasonably sure, from prior questioning, that all their subjects thought it wrong, unethical, to give bribes. The group told that unethical behavior was against company policy bribed less.

Code Content

OK, the committee says, we should have a code of computer usage. Maybe it will do some good. Certainly it can do no harm. But what should be in it? What topics? And what should be said about those topics?

Reasonably, the members of the committee start by looking at codes from other schools. They do a literature search, of sorts. In all, they look at 38 codes. They have some help. Many codes have been collected together and are now made available by an EDUCOM committee (EDUCOM Review, Winter 1990, p. 63). The codes are from universities and colleges of all shapes and sizes, most from the US but even a few from abroad. Of course they find great differences among the codes. Some are very brief and not detailed. Others are lengthy and specific. And they find substantial variation in content. (See Figure 1.)

Figure 1. Topics in College Computer Ethics Codes

| Topic | Codes in which topic is discussed |
| --- | --- |
| Disciplinary action resulting from code violation | 30/38 (79%) |
| Respect for the property and privacy rights of other users | 27/38 (71%) |
| Respect for software license agreements, copyright and patent laws | 25/38 (66%) |
| Encroaching on other users’ fair use of system resources | 23/38 (61%) |
| Harassing or annoying other users | 22/38 (58%) |
| Use of system or its parts only for authorized purposes | 22/38 (58%) |
| Unauthorized lending of accounts, User-IDs, passwords | 18/38 (47%) |
| Degrading system performance | 16/38 (42%) |
| Use of system or its parts only by authorized users | 13/38 (34%) |
| Exhaustiveness of code | 12/38 (32%) |
| Computer-assisted plagiarism | 11/38 (29%) |
| Unauthorized modification of system (software and hardware) | 9/38 (24%) |
| Where to go for clarification of code provisions | 7/38 (18%) |
| Criminal code provisions | 7/38 (18%) |
| Underlying commonsense ethical principles | 6/38 (16%) |
| Circumventing accounting procedures | 6/38 (16%) |
| Exploiting loopholes in system security | 5/38 (13%) |

Most codes, but not all, say that disciplinary action will result from a violation. Disciplinary action varies with the severity of the computer misuse. Sometimes it includes dismissal from the university. Often possible criminal prosecution, where code violations also break the law, is mentioned as well. Most encourage respect for the privacy and property rights of other users of the campus system. They also have something to say about what those rights are. The right to privacy, at least in this context, is typically thought of as the right not to have your files or communications examined by others without consent. Property rights include rights that others not take or alter your files, for example your program or data files, without your consent. Sometimes it is pointed out that lack of certain kinds of read-, write-, or copy-protection on files does not mean a user has consented to others reading, changing, or taking the files, any more than an open door signifies consent to enter an office or borrow or fiddle with any of its contents. Many codes also explicitly say something about respect for software licensing agreements and copyrights. Often they quote big chunks of the EDUCOM resolution on intellectual property rights. Some, along with the encouragement of respect for the property rights of other users and software vendors, inveigh against computer-assisted plagiarism.
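
The open-door analogy can be made concrete. Below is a minimal sketch in modern Python, assuming a Unix-style system (nothing like it appears in the campus codes themselves), that inspects the permission bits governing whether “other” users may read or write a file. The point made above is precisely that a world-readable mode is a technical fact about protection, not a grant of consent.

```python
import os
import stat
import tempfile

def world_access(path):
    """Return (readable, writable) flags for 'other' users of a file."""
    mode = os.stat(path).st_mode
    return bool(mode & stat.S_IROTH), bool(mode & stat.S_IWOTH)

# Demonstration on a throwaway file deliberately left world-readable.
with tempfile.NamedTemporaryFile(delete=False) as f:
    path = f.name
os.chmod(path, 0o644)  # owner read/write; others read-only
readable, writable = world_access(path)
print(f"others may read: {readable}, others may write: {writable}")
# Readability here describes the file's protection; it says nothing
# about whether the owner consents to being read.
os.unlink(path)
```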

About 2/3 of the codes the committee looks at say something about using campus computing facilities in a way that acknowledges other users’ needs for perhaps scarce, certainly finite, facility resources. Some, for example, talk about playing games on student cluster microcomputers, workstations, or terminals that are provided for doing homework assignments. Most talk about not misusing printing services to make unnecessary copies or misusing disk or other storage to save useless files. The codes often also discourage harassing or annoying these other users by, for example, sending them electronic mail of a kind they find objectionable or pestering them in clusters.

Most codes also have something to say, in various ways, about authorized or permitted use of school computers. In general, of course, they insist that only members of the campus community or authorized others are permitted to use its computing facilities, just like the gymnasium or library. Students who have dropped courses which gave them accounts, for example, may no longer be authorized to use the computing system or particular parts of it. Sometimes certain machines can only be used for certain purposes or by certain groups. They may be machines purchased on a grant for a research project, and so not for general use. More often than not, it is said that university or college computing facilities cannot be used for commercial purposes, and can be used for outside work, consulting and research, for example, only with special permission. Many codes prohibit unauthorized alteration of school-owned software and hardware. Many forbid activities that would degrade system performance or get around accounting provisions, but some also explicitly forbid unauthorized modifications that might be believed by hacker modifiers to improve system performance or abilities (Levy, 1984).
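
What enforcing such authorization provisions can amount to is pictured in the hypothetical sketch below; the names, the enrollment records, and the drop-the-course rule itself are invented for illustration and come from no actual campus code.

```python
from dataclasses import dataclass

@dataclass
class Account:
    user_id: str
    course: str                       # course that granted the account
    special_permission: bool = False  # e.g., approved outside consulting

# Invented enrollment records: adoe has dropped the course.
enrollments = {"jsmith": {"CS101"}, "adoe": set()}

def is_authorized(acct: Account) -> bool:
    """Drop the course, lose the account, unless specially permitted."""
    if acct.special_permission:
        return True
    return acct.course in enrollments.get(acct.user_id, set())

print(is_authorized(Account("jsmith", "CS101")))  # True: still enrolled
print(is_authorized(Account("adoe", "CS101")))    # False: dropped
```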

The committee also finds that some codes talk about underlying commonsense moral principles that guide the code’s specific provisions. Often, it is remarked that the ethical rules about privacy and property and fair usage of common resources that apply elsewhere also apply to computers and their considerate use. Along with this it is suggested that the code’s specific rules do not cover every conceivable circumstance, only some of the more obvious and typical ones. Thus, reference to these more general principles is needed for unmentioned circumstances. Sometimes, however, instead of referring to everyday ethical principles, the ethics codes refer to or quote from provisions of the local, usually state, criminal law to provide content.

What does the committee find is said about these various topics? Again, great variation. All codes that mention the topic, for example, agree that users shouldn’t harass others. All that mention the topic agree that copyrights and site licenses should be honored. But some codes assert things others deny. For example, some schools assert that they, not users, own all electronic files. Others believe that users own files they create, much as students and faculty presumably own papers they author. Some absolutely forbid recreational game-playing on their machines. Others allow it when, by some settled indicator, usage of the system by people doing research or homework is low. Some absolutely forbid users, or at least student users, to lend their accounts or user “IDs” to others, even if those others are associated with the university or college. Other codes are more lenient, only holding the account owners responsible for any consequences of a loan.

Why, the committee members wonder, is there such variation in content of codes, both in topic and what is said about a topic? And does this bear on what our college’s computer ethics code should look like?

Some variation is easily explained. Perhaps some don’t mention discipline for rule-breaking because the subject is dealt with elsewhere in a handbook from which the code is excerpted. Codes which do not explicitly discuss computer-assisted plagiarism may take it for granted that plagiarism of all kinds is sufficiently discouraged by a general prohibition stated somewhere else in student, staff, and faculty handbooks. If X is wrong, so is X-done-with-a-computer. Some differences are easily explained by differences in the nature and extent of computing facilities available. Some colleges’ codes may not mention skirting around accounting procedures because they have none. Everyone uses a stand-alone microcomputer and there are no computing accounts. Some have a richer environment, with many machines, and thus are not so hard-nosed about recreational game-playing. To write its code, the Pimli College committee therefore has to find out what is said in the college’s other codes of conduct, what the particular arrangement of its facilities is, how lenient it can afford to be about non-schoolwork uses of its machines, and so on.

But, asks a philosopher on the committee, what about the differences over who really owns users’ files, what their privacy rights are, and such matters as whether they can loan out their accounts or user “IDs” or give their passwords to others? There is, he thinks, a fundamental problem about ethics codes (cf. Ladd, 1980; DeGeorge, 1990, pp. 389–390), and it shows up in these differences among the codes.

These codes, he says, might be supposed to be statements of right and wrong, statements of already existing moral or ethical truth, concerning the proper use of computers on campus. It looks like some code-writers think of them this way. But then certain things follow. For example, it is not up to the committee to decide to make certain activities right or wrong, any more than it is up to Congress to decide that kidnapping is wrong. The committee couldn’t, for instance, adopt a rule that said that harassing or annoying other users is okay or that reading their e-mail without consent is fine, if the claim of the code is to record what is right and wrong. Those rules would just be incorrect.

Maybe, on the other hand, he says, these codes are pieces of legislation or rules like those for the college’s parking lots. While there are certainly limits, the committee can then pretty much say what it wants in the code. There are no moral truths about who can use which parking lots on campus. There’s nothing in ethics about allowing or not allowing users to loan out their user “IDs”. The committee just says that this is the way things are set up at Pimli. It’s like a landlord-tenant agreement. Pimli is the landlord and the users are the tenants. Maybe the committee will have reasons for having certain rules in the code, just like landlords do. Some of them might even have some close connection to ethics, like the rule prohibiting harassment. But some of them might have only more mundane reasons, like the rules about who can use which parking spaces. Maybe, for example, there are reasons – reasons of efficiency or convenience – which lead to restrictive rules about the loan of accounts and user “IDs”, or maybe there aren’t.

Anyway, he says, the committee has to decide which of these two kinds of animal the code is supposed to be. That determines some of what can be said in the code.

A Code and More

After much to-ing and fro-ing, the Computing Advisory Committee adopts a Code of Computer Ethics. It is, it turns out, very much like the Model Code devised by Chuck Augustine (see “Appendix – A Model Ethics Code for Computer Usage” below). So the head of Academic Computing is satisfied. But the committee thinks perhaps it should do more. The business ethics professor knows, for example, that businesses and professions certainly sometimes do more (Velasquez, 1990; Lee, 1986).

What businesses do, he explains, is run ethics training workshops. They do this in addition to having a code of conduct. The workshops can run for a morning or afternoon or several days. Sometimes they are run by people within the organization, often the managers of those taking the workshop. Sometimes they are run by outside instructors, specialists in business ethics. They can include a number of items: an overview of the company’s code of ethics, the discussion of several hypothetical cases that illustrate the code’s provisions, the discussion of some cases that involve moral dilemmas possibly not addressed by the code, an explanation of any assistance the company provides when employees are faced with a moral problem not clearly resolved by the code, and a discussion of things that might discourage ethical behavior in the company. Sometimes videotapes are used. Sometimes printed materials. But whatever their other differences, these workshops include the discussion of cases in which a decision needs to be made or commented on. And many of the professions do something similar; “ethics rounds” for doctors, for instance, are well known (Macklin, 1987).

Okay, so what are these case discussions like? asks the user consultant. I certainly have a case. What’s going to be said about it? (See “Figure 2. The Love Slave Advertisement” below.)

Well, says the business ethics professor in reply, we’d look at the code and see whether it forbids or allows the activity. It would be a discussion much like that among lawyers and justices determining what the Constitution forbids or permits. These discussions are intended to illustrate the code and help people understand what its rules mean. If you look at our new code, you can see that it would forbid the Love Slave posting. Notice the code says that public messages should respect the dignity of all users of the system. This one doesn’t; it’s offensive to women and Filipinos (Kerr, 1991). Some users might not appreciate that fact, thinking they were just being funny, and so not see how the code applied.

Figure 2. The “Love Slave” Advertisement

  • Like many universities, Pimli College makes computer bulletin boards available to all its members. These are like the physical bulletin boards that also exist on campus in many ways. Many of the electronic bboards can be posted to and read by anyone at the college – indeed by anyone with access to the Internet. Like many physical bulletin boards, they are segregated by subject matter. Some are devoted to public discussion of the computing systems available on campus, to a particular computer language, or to a particular course. Some are devoted to complaints about campus dining services. One, in particular, is called “Market” and is dedicated to advertisements for items for sale or wanted. The following advertisement recently appeared on this bboard:
    “Date: 7 Feb 91 00:55:39 -0500 (EST)
    From: Kevin W.
    To: Market
    Subject: Filipino Love Slave!!!!!
    For Sale… One Filipino Love Slave… mid 20s… black hair, blue eyes, experienced in all sex techniques… familiar with over a hundred different sex positions… qualified expert in oral, bondage, anal, and ball-busting methods. Will provide own equipment: chains, whips, rubber body suits, studs, dildos, etc… Will be available for sex at any moment and built to deliver the goods. If interested, will provide picture free… $10 for girl buck naked and in heat… sizes 36-22-34. Price $800 negotiable… dirt cheap!!!!!”

All right, that’s clear enough, says the skeptical scientist who earlier asked about evidence for the effectiveness of codes. But I want to know how you know these discussions make a difference.

That’s easy, too, says the business ethics professor. There’s plenty of evidence, he continues, that discussion of hypothetical cases, especially those that concern some sort of dilemma, improves people’s ability to reason about moral matters (Schlaefi, Rest, and Thoma, 1985). Some of Kohlberg’s work on moral development was devoted to gathering this kind of evidence (Kohlberg and Turiel, 1971; Blatt and Kohlberg, 1975).

Hold on, the philosopher interrupts. There seem to be two kinds of discussions, he says. You’re mixing up yours with the kind of discussions Kohlberg investigates. There’s the training session. This is what you described. And probably they make some difference. Certainly you can expect people to benefit from help in understanding rules or practice in their application. That’s true for the construction of proofs in geometry and language learning (Anderson, 1991). Why should things be different with the rules in college ethics codes? So perhaps these are – for certain audiences, people new to the college and its ways – the most appropriate items. But besides the training sessions there are the real discussions, the searches for the truth. Your training session doesn’t ask, for example, whether our code should forbid such a posting. It takes the code for granted; its wisdom is not to be doubted. In fact, it’s these sorts of discussions that Kohlberg found effective at improving people’s reasoning in ethics.

But I don’t understand now, says the user consultant, what you’d do in these discussions, what you’d say, how you’d conduct them.

I can certainly think of some things to say about the Love Slave posting, replies the philosopher. People say, for example, that everyone has an inalienable right to freedom of speech or expression. Although some have in mind only a First Amendment constitutional right, others have in mind something else, a moral or human right, one we’d have whether or not our or anyone else’s constitution said anything about freedom of speech. It’s one we can’t give up or lose; it comes with being human or a person. That’s what they mean by its being inalienable. All the poster was doing was exercising his right. So he can’t have done anything wrong.

But the reasoning here isn’t so great, continues the philosopher. For having a right to freedom of expression doesn’t make every exercise of it right. It doesn’t give you blanket permission to lie, or to sing loudly, even political songs, in the dorms after midnight. Others would rather not appeal to rights but to the effects of the posting. Certainly, there is something wrong – at least inconsiderate – about the posting. It’s not that it is offensive, but how it gets to be so. It uses a prejudice against a disadvantaged group, one that hasn’t traditionally had the resources to even up the odds against this kind of prejudice (cf. Glass, 1978). Suppose, for example, the advertisement talked about a white male golfer instead?

Of course, continues the philosopher, some of this discussion may bring up the code. For instance, people may think that having a right to free speech means that the college can’t forbid any speech without violating that right. But other rights – for example, property or privacy rights – do not remove the possibility of restricting their exercise without doing wrong. Some may say that the college should allow speech a wide – perhaps even absolute – swath of freedom because of its beneficial consequences. After all, college is a place for research and education. Freedom to express and criticize ideas is needed for the investigation of their worth (Mill, ch. 2; Schauer, 1982, ch. 2) and for the growth of autonomous individuals (Mill, ch. 2; cf. Scanlon, 1972). But this, of course, doesn’t show that the Love Slave posting was okay. Finally, that it was wrong or inconsiderate doesn’t show that the code should forbid it. Perhaps, for learning purposes, it is best to leave such cases to public rebuke rather than instituting official sanctions.

And as for methods of discussion, there are several (Hannah and Matus, 1984; Hall and Davis, 1975, ch. 8). They include statements of the facts of the case and the offering and criticism of moral reasons favoring the evaluation of one choice over another. Both students and instructors participate in these offerings and criticisms. The result may not be a conclusion, or it may be one that best sits in “reflective equilibrium” with all that’s said (Daniels, 1979). What’s important is not only the conclusion, if there is one, but the reasoning in support of it as well.

Carnegie Mellon University

References

  • John Anderson, Cognitive Psychology and Its Implications, W.H. Freeman, 1991.
  • Charles Augustine, “The Pieces of a Policy: Categories for Creation of a Computer Ethics Policy,” SIGUCCS’89, Association for Computing Machinery, 1989, pp. 163 – 67.
  • Moshe Blatt and Lawrence Kohlberg, “The Effects of Classroom Moral Discussion on Children’s Level of Moral Development,” Journal of Moral Education, Vol. 4, 1975.
  • Norman Daniels, “Wide Reflective Equilibrium and Theory Acceptance in Ethics,” Journal of Philosophy, Vol. 76, 1979, pp. 256 – 82.
  • Richard DeGeorge, Business Ethics, 3rd ed., Macmillan, 1990.
  • Thomas J. DeLoughry, “Widespread Piracy by Students Frustrates Developers of Computer Software,” The Chronicle of Higher Education, August 12, 1987.
  • Michael Gemignani, “Copyright Law as it Applies to Computer Software,” The College Mathematics Journal, Vol. 20, 1989, pp. 332 – 38.
  • Marvin Glass, “Anti-Racism and Unlimited Freedom of Speech: An Untenable Dualism,” Canadian Journal of Philosophy, Vol. 7, 1978, pp. 559 – 75.
  • Robert T. Hall and John U. Davis, Moral Education in Theory and Practice, Prometheus Books, 1975.
  • Larry S. Hannah and Charles B. Matus, “A Question of Ethics,” The Computing Teacher, August/September 1984, pp. 11 – 14.
  • W. Harvey Hegarty and Henry P. Sims, Jr., “Organizational Philosophy, Policies, and Objectives Related to Unethical Decision Behavior: A Laboratory Experiment,” Journal of Applied Psychology, Vol. 64, 1979, pp. 331 – 38.
  • David R. Johnson, Thomas P. Olson and David G. Post, Computer Viruses, United Educators Insurance Risk Retention Group, May 1989.
  • Deborah G. Johnson, Computer Ethics, Prentice-Hall, 1985.
  • Thomas M. Kerr, “Harassment Undermines Free Speech,” The [Carnegie Mellon University] Tartan, February 18, 1991.
  • Lawrence Kohlberg and Elliot Turiel, “Moral Development and Moral Education” in G. Lesser, ed., Psychology and Educational Practice, Scott Foresman, 1971, pp. 410 – 65.
  • John Ladd, “The Quest for a Code of Professional Ethics” in Rosemary Chalk et al., eds., AAAS Professional Ethics Project, American Association for the Advancement of Science, 1980, pp. 154 – 59.
  • Chris Lee, “Ethics Training: Facing the Tough Questions,” Training, March 1986, pp. 30 – 41.
  • Steven Levy, Hackers: Heroes of the Computer Revolution, Anchor Press/Doubleday, 1984.
  • Ruth Macklin, Mortal Choices, Pantheon, 1987.
  • John Stuart Mill, On Liberty.
  • Donn B. Parker, Susan Swope and Bruce N. Baker, Ethical Conflicts in Information and Computer Science, Technology, and Business, QED Information Sciences, Inc., 1991.
  • Thomas Scanlon, “A Theory of Freedom of Expression,” Philosophy & Public Affairs, Vol. 1, 1972.
  • Frederick Schauer, Free Speech: A Philosophical Enquiry, Cambridge University Press, 1982.
  • Andre Schlaefi, James R. Rest, and Stephen J. Thoma, “Does Moral Education Improve Moral Judgment? A Meta-Analysis of Intervention Studies Using the Defining Issues Test,” Review of Educational Research, Vol. 55, 1985, pp. 319 – 52.
  • Judith Jarvis Thomson, “The Right to Privacy,” Philosophy & Public Affairs, Vol. 4, 1975, pp. 295 – 314.
  • Manuel G. Velasquez, “Corporate Ethics: Losing It, Having It, Getting It” in Peter Madsen and Jay M. Shafritz, eds., Essentials of Business Ethics, Penguin Books, 1990, pp. 228 – 44.

Making a Code of Computer Ethics Work at Pimli College

1.0 Life at Pimli After the Code is Adopted

Congratulations to Pimli College for doing so many of the right things when it decided to write a Code of Computer Ethics. The code writers investigated the constraints by which they are already bound (state and federal laws, present student and faculty codes of conduct, and network guidelines); discussed what they hoped to accomplish with a code; assembled interested parties from a variety of campus perspectives to discuss and write; looked at models from other institutions; made up a list of problem behaviors; and talked about how to raise awareness of the issues (beyond merely writing the Code).

They could also have discussed two other, related, areas: (1) the ethical environment already operating at Pimli, and (2) activities and attitudes necessary to make it possible for the Code to be taken seriously. If the ethical environment is inimical to ethics codes, or if only Pimli’s Computer Center is left to support the Code, its framers may have worked to no purpose.

For about eleven years, I was a professional academic computing services (ACS) person; for five of those years, one of my responsibilities was talking to students and faculty who had breached, sometimes without realizing it, an ACS policy. In that role, I was able to talk to offenders and understand some of the reasons they misunderstood or chose to flout a policy. From my colleagues and staff at ACS, I heard many ethics “war stories,” and together we searched for ways to explain to our users why the policies were there and what the consequences would be if illegal or unethical computer and network activities continued unabated.

For the past year, I have been a professor in a small science, technology, and design school which is a part of the State University of New York. In this role, I see the forces at work on students and faculty which keep them from examining too closely the way they treat software, privacy, intellectual property, and network access. I teach five courses a year in which I can directly influence students (most of them freshmen) to discuss ethical issues. I consult with faculty colleagues often enough to suggest the dilemmas which should be considered. My remarks on this topic, then, have been forged by experience on both “sides” of the faculty/staff divide.

Judging from the number of people trying with great difficulty to do it, writing and adopting a Code of Computer Ethics is a tremendous achievement. However, once it is written and adopted, will it do what the framers want? Or will it be a shield behind which the college can hide from lawsuits?

Let’s assume that the ethical environment at Pimli is “typical,” by which I mean that some parts of the college take ethical issues very seriously and others don’t think about them much at all. Furthermore, rules apply more to students than to faculty or staff, and students pay more dearly for mistakes than do others. In such an ethical environment, how will the Code be supported and by whom? (“Support” here means “understand and disseminate and discuss and explain and enforce and uphold.”) Could the writers of Pimli’s Code have been wasting their time? Will the Code be published but ignored? Will the faculty/student double standard apply to it as well? Will college administrators have the guts to take action when the provisions of the Code are violated, or will the all-too-real threat of lawsuits brought by the offenders back them down? For such a Code to do what its framers want, it must be supported by the right players making the right plays.

2.0 The Players and the Plays

2.1 Top Administrative Officers

To be really effective, a Code of Computer Ethics must be supported right from the top – the Board of Trustees and the Chancellor or President of the college. As long as the people with the most responsibility shirk it, administrators and faculty can tell themselves that it’s not a high priority of the college.

The Chancellor or President, as an agent of the Board of Trustees, can announce support for the Code and mention it from time to time to all his audiences, including the college and town news organizations. EDUCOM wrote a code covering intellectual property rights which many college and university computing organizations have endorsed and publicized. But at many of these institutions the highest officer has neglected to embrace it, and so it is not official institutional policy but remains a stepchild.

The next administrative rank (vice-presidents, provosts, vice-provosts, and directors) must support the Code at their level and below by passing down to the next level both the letter and the spirit of the Code. Meetings of the faculty, or deans, or directors, and college unit retreats are venues for discussions of the ideas behind such a Code, its provisions, and sanctions. Internal school, department, or division newsletters are good vehicles for discussion and explanation.

The academic officers of the college can widely disseminate in student publications and handbooks the Code, the ideas behind it, and the consequences of ignoring its provisions. Further, these officers can decide how vigorously to enforce provisions of the Code and how judicial and policy review boards shall handle individual infractions.

The institution, through the actions and decisions of its academic and administrative officers, should be prepared to spend the money necessary to buy legal copies of all software which it sponsors in central or departmental clusters and labs. The perhaps unintended message given by having illegal software in a college cluster or lab is easy to understand and hard to overcome.

2.2 Top Academic Officers

The various officers of faculty governance and academic responsibility, such as deans, department chairs, and faculty senate officers and appropriate committee chairs, must reinforce the support for the Code coming from above. The message that ethics, particularly in this instance computer ethics, counts cannot be heard too often.

Deans and department chairs must find creative ways to get legal copies of software to their faculty and students, especially if they are encouraging the faculty to “computerize” the curriculum. Even usually ethical faculty members will be tempted to use unauthorized copies of software if they are pressed to include computing in their courses while denied the departmental funds to buy enough copies; ethically lax faculty members use the economic argument to rationalize their habitual behavior. Strong messages about computer ethics, coupled with vigorous efforts to support ethical behavior, will have a salutary effect.

It’s ironic that faculty members who understand very clearly the effects of having their own work stolen or used without citation will often steal software with impunity. Top academic officers need to point out, often, the inconsistency in these behaviors.

2.3 Faculty Members

Faculty members, especially those who require students to use computers or computer networks in their courses, must support the Code in their courses, their faculty meetings, and their labs.

Faculty members can discuss with their students the ethical dilemmas which arise through the use of computers and computer networks, the provisions of the Code which apply in their classes, and the consequences of not knowing and abiding by the Code.

Faculty can also devote one class period to awareness-raising, by using scenarios which put the students into likely computer ethical dilemmas and letting them discuss how they would react. [I did this myself and would be glad to share materials with anyone else who’s interested.] The purpose of this “workshop” is to raise issues, make the connections between ethical principles understood in other contexts and the computer situations (such as plagiarism and software theft), and stimulate thought, rather than to “lay down the law.” However, it is useful for students to understand the Code and its sanctions early in any semester.

Faculty can decide ahead of time how they will respond when they see that a student has breached the Code or is heading in that direction. This works best if the students have in writing early in the course what is expected of them and what will happen if they don’t meet those expectations. Faculty have lots of latitude with this, but since computer ethics is a “new” topic (newer, say, than garden-variety plagiarism), this might work better if the faculty member checks with the department chair or dean or dean of students. The more the faculty is unanimous in its expectations and responses, the easier it is for students to understand.

Faculty members requiring their students to use institutional computing resources should check with computer center staff to be sure adequate resources exist. Otherwise, some students will believe themselves “forced” into making illegal copies of software or data to get their work done on time. Faculty can also investigate using texts which come with student versions of software. Furthermore, faculty can often be flexible when suggesting or requiring software, so that students can use less expensive software packages or software they already own.

Faculty communications about ethical dilemmas to graduate students, some of whom will become the next generation of faculty members, are powerful. If the college wants to influence succeeding generations of students, graduate students must help support the Code, both while they are TAs and lab assistants and later when they are faculty.

And under no circumstances should a faculty member engage in any computer activity which violates a contract, invades the privacy of a fellow faculty member or student (unless those people have been warned that this is a common practice), shows disrespect for another’s intellectual property, wastes shared resources, or harasses others. Students have antennae exquisitely tuned to detect hypocrisy.

2.4 Computing Staffs

Central computing services staff and the staff of any departmental computing labs and centers, who have, for a long time, been the voices crying in the wilderness, will continue to be active in this area.

The Computer Center can adopt operating procedures which support the Code, such as refusing to help any student or faculty member who is using stolen software or trying to use networks to send chain letters. System administrators can refuse access to timeshared systems to abusers. Computer staff can post the Code and their own operating procedures prominently in clusters and labs, print them in newsletters, and give them out in handbooks and on account application forms.

Computer Center staff can adopt and publicize codes of computer ethics which complement their own Code, such as those published by EDUCOM and ADAPSO and by the various academic networks such as BITNET.

Computer Center staff can use adequate technical safeguards on timeshared computers, such as using passwords and asking users to lock disks. Users should be taught to protect themselves against computer abuses.
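
As one concrete illustration of such safeguards, the sketch below shows the kind of password-acceptance rule a computer center might apply when accounts are created. It is a minimal sketch; the particular thresholds, and the sample user ID, are invented for illustration, since the text prescribes no specific rule.

```python
import re

def acceptable_password(pw: str, user_id: str) -> bool:
    """Reject passwords that are easy to guess on a shared system."""
    if len(pw) < 8:
        return False              # too short to resist guessing
    if user_id.lower() in pw.lower():
        return False              # must not contain the user's own ID
    # Require at least two character classes (lower, upper, digit).
    classes = [r"[a-z]", r"[A-Z]", r"[0-9]"]
    return sum(bool(re.search(c, pw)) for c in classes) >= 2

print(acceptable_password("Pimli1991x", "kwaters"))  # True
print(acceptable_password("kwaters1", "kwaters"))    # False
```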

Computer Center staff can link microcomputers in clusters with network software which prevents unauthorized copying.
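
One way such network software can work is by metering concurrent use against the number of copies the college has bought. Here is a speculative sketch (the package name and seat count are invented): a checkout is refused when every purchased seat is in use, rather than letting an extra, unlicensed copy run.

```python
import threading

class LicensePool:
    """Meter concurrent use of a package against purchased seats."""
    def __init__(self, package: str, seats: int):
        self.package = package
        self._seats = threading.BoundedSemaphore(seats)

    def checkout(self) -> bool:
        # Grant a seat if one is free; refuse rather than over-copy.
        return self._seats.acquire(blocking=False)

    def release(self) -> None:
        self._seats.release()

pool = LicensePool("StatPack", seats=2)  # hypothetical statistics package
print(pool.checkout(), pool.checkout(), pool.checkout())  # True True False
```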

2.5 Student Government and Leadership Groups

Student governmental bodies and Greek Society leadership must be part of the support of the Code, or else students will see it as yet another set of onerous rules imposed on them by faculty and administrators. These groups can lobby to be included on decision-making bodies; they can foster discussion of the Code in their publications and at their meetings.

Pimli College apparently did not have any undergraduate or graduate student representation on its Computing Advisory Committee, and that is a mistake, I think, for the reason cited above.

2.6 Judicial Bodies

Student and faculty judicial bodies and personnel units responsible for explaining and enforcing college policies should support the Code, lest the college mirror society at large wherein the police vigorously arrest offenders but believe the judges let them off too lightly.

Judicial and policy review boards must be helped to understand the seriousness of breaches of computer ethics policies. Too often, because they lack experience in this specific area, they do not make the connections between ethical dilemmas induced by computer use and those with a longer human history. The members of these boards should call on computer center staff members for explanations of how computers and computer networks can be abused and on law school professors and philosophers for discussions of privacy, intellectual property, harassment, and contract law.

Judicial bodies must be helped to act strongly and consistently when deliberating on computer abuse cases. They need to be backed up by top administrative officers and by the college’s legal department.

2.7 Legal and Purchasing Departments

Purchasing and legal departments are important players in this exercise. Often Purchasing staff notice that a person or an office is ordering hardware without enough software to ensure legal copies. Purchasing staff can remind the person ordering that software (legal, in this case) is what makes hardware work.

The college, through its purchasing, legal, and computing organizations, should work with vendors to forge the kinds of partnerships and make the kinds of deals which encourage the institution and its faculty, staff, and students to buy software, rather than steal it. The fact that a particular vendor charges a “high” price for software does not excuse theft; however, if both vendors and institutional representatives are seen to be trying to do their part, faculty, staff, and students can be more easily pressed to do theirs.

3.0 Thoughts at the End

Pimli College isn’t much different from the college or university each of us is familiar with. In each Pimli, some people care deeply about the ethical climate, about the role of faculty and staff behavior in the lives of our students, about the obligation we have to encourage ourselves and our students to reflect on the ways in which we treat each other. In each Pimli, lack of reflection about ethical issues, ethical relativism, cynicism, and the press of time prevent the people who care from working effectively at the right levels to make a change.

As good as it is to have a Code of Computer Ethics, it will be only as good as the people who support and defend it. Many of today’s students want to do the right thing, want to know what “the right thing” is. But thinking about the right thing is hard and uncomfortable work, and we staff and faculty who will inevitably influence some students are obliged, I believe, to think aloud about the uncomfortable ethical issues, to struggle in public with the dilemmas, and to help our students think and struggle so they become thoughtful and ethical members of society.

SUNY College of Environmental Science & Forestry

Intricacy and Impacts of Computing Policies on University Campuses

Leslie Burkholder has presented an excellent case study of developing a Code of Computer Ethics on a university campus. The proposed code has been discussed as a means of motivating people to do what is right, reducing computer abuse, and protecting the university from liability suits. A code of computing usage reflects many wishes of an implicit campus computing policy which promotes the use of computer systems to support the educational and research activities of the university. The intricacy of developing such a code is clearly demonstrated in Mr. Burkholder’s case.

In spite of many attempts to promote the use of computing technology in higher education, such use has progressed rather slowly. There are some technological constraints, but these are far fewer than the social constraints. A policy to promote the use of computers on campus is the reflection of faith that the technology will benefit the university. How strong the policy should be depends directly on how much faith we have in the new technology. Powerful tools always have powerful effects. To take on the task of developing a campus computing policy, one must consider the potential benefits that the tool might provide and the possible effects on institutional changes and consequences that the tool might cause.

To address campus computing policy issues and their impacts requires either extreme audacity or naiveté. The subject is so complex, having an enormous effect on the educational system itself, that no definitive statement could possibly be made at this time. Probably this is the reason why there are very few explicit computing policy statements to be found on American campuses.

The difficulties in developing such statements have been demonstrated by Mr. Burkholder’s example of the development of a code of computer usage. Disclaiming both audacity and naiveté, I would like modestly to identify some intricate issues and to explain why these issues are complicated and have important effects.

The technology itself will not cause problems. The university or individuals in it, by their policies or allocation of resources, determine the speed and nature of possible institutional changes due to the introduction of the technology. Adaptations and changes are necessary in order to derive the desired benefit from the new tool. The use of such a powerful tool has been associated with many potential effects and impacts, even unforeseen consequences. To develop a viable policy, one must assess the feasibility of a realistic implementation which can deliver a certain level of anticipated benefit and can, to some extent, prevent undesirable consequences.

1.0 An Interlocking System of Technology and Human Action

The introduction of computers to campuses is mainly driven by technological advancements. In many cases, we are simply adopting new systems as they become available. The people who are developing the technology often do not look at the whole societal picture, and there is no well-established and accepted means of looking at the whole. Perhaps, the major concerns for the development of campus computing policies are what and how new technologies should be appropriately and effectively used to enhance our higher educational systems.

Technology consists of a system of implements and a system of human organization. A campus computing policy must consider an interlocking system of technologies and human actions in the pursuit of higher educational goals. Education must relate to a way of life. More attention should be paid to how the application of technology may be used to help individuals learn more effectively and how the campus must change to prepare individuals to live more effectively in a technological society. The present higher educational system is certainly not optimal in solving the problems of this technological age. Campus computing policies should not only address operational issues concerning the use of computer and network systems, but they should also consider the necessary changes in the educational system itself.

A number of fears have been voiced with respect to the use of computing and network technologies, as indicated in the case presented by Mr. Burkholder. We fear potential abuses. Is it possible that we actually fear the potential institutional change itself?

2.0 Planning Computer Usage for Educational Systems

Use of computers has an impact on university educational systems in several ways. It enhances the learning process, it provides new tools for increasing research creativity, and it introduces new tasks for colleges and universities. More importantly, it may cause structural and organizational changes.

The use of computer technology should be carefully studied, planned, implemented, and managed. For the most part, this has not been done in the past.

Ideally, the introduction of computing technology should be guided by a campus policy to have planned growth to support the educational and research goals of the university. However, most campuses have been adopting computing and network systems in a rather ad hoc manner; and in many cases, new systems are being introduced simply because of their availability from computer vendors. In some cases, new systems have been implemented because other campuses have already installed those systems. Are we being faced with the choice of computing or perishing?

As mentioned above, in spite of many attempts to use computers in higher education, progress has been rather slow. Generally speaking, the educational process is considered one of the least automated. The technology actually available is far more advanced than its educational applications. In one sense it seems that we are generating new systems and information much faster than we can develop the ability to use them. This is obvious from the waste resulting from poorly planned applications, or even from misconceived educational and research processes or tasks. And poorly developed systems and applications are often the subject of more abuse.

2.1 The Issue of Planning

A reasonable campus computing policy must address the issue of a planning and development process which involves a dedicated and broadly representative group empowered to do something about it. Often a campus-wide planning committee ends up engaging in a great theoretical exercise. But there has to be, in addition, the leadership and determination to interact with faculty, staff, and students, and to plan and design a viable computational system within the university’s real educational and economic environment. Such a process of planning, analysis, design, and development should be an important element of the campus computing policy.

2.2 Reconciling Opposing Viewpoints and Interests

The development of campus computing policies is as much a political endeavor as a technical task. The use of computers as an engineering phenomenon may be discussed more or less objectively. However, when we consider the potential value and impact of computing technology on our educational system, vested interests, personal values, beliefs and opinions enter into the discussion. Very few, if any, “objective” discussions are to be found in this area.

At one extreme, a technophobe may predict imminent difficulties, consequences, and possible disasters from overuse of computing technology. At the other extreme, a technologist may predict that productivity can be greatly increased without any noticeable change in the organization of the campus community. How these opposing viewpoints may be reconciled is certainly a nontrivial issue.

2.3 Planning and Managing Change

Human action has always been augmented by tools and machines for accomplishing desired ends. Almost all human life involves technologies, including educational and recreational ones. Technology consists of a system of strategies and tactics for the production of various kinds of products, both tangible and intangible. The use of computing technology is just one part of socio-cultural evolution – or perhaps revolution. How to manage change is the central question.

Many individuals are afraid of change, since it creates an unknown that may alter their lives. There is a natural human tendency to associate tradition with truth and goodness, and any change in tradition with evil motives. Consequently, many computing policy discussions are emotional rather than grounded in fact.

3.0 Organizational Changes and Their Potential Implications

Introduction of computers on a university campus may cause organizational and program changes. The policies which direct development and operation of computer systems will certainly affect these changes.

3.1 Organizational, Structural and Operational Changes

The most important computer policy issues are (1) the placement of a computing-related office in the organization, and (2) the way in which resources are to be allocated. The organizational structure often dictates technological approaches and usage patterns. The issues may include the selection of a centralized or distributed computing approach, the development of application systems and operational responsibilities, and user access controls.

The selection of a technological approach has strong organizational and operational implications. Centralized mainframes and distributed or dispersed workstations and personal computers will have quite different organizational impacts. Whether to integrate or separate the management of computers and communication systems – including telephones, fax machines, video signals, etc. – is also a major policy decision with definite organizational consequences.

Management of user access patterns is also influenced by the organizational structure and the selection of technological approaches. Who can access which systems and which data files, and how usage is charged, monitored, and controlled, are also matters with strong policy implications.

3.2 Potential Changes in Educational Programs

The computer-use policy on a campus not only provides a powerful new tool to enhance the existing educational system; it may also introduce changes in the educational system itself. Educational delivery systems may be drastically changed. The availability of a new tool may affect our way of problem-solving in both educational and research activities. Program objectives and contents may be changed to prepare students for the forthcoming information age. The concept of the university itself may change. How a campus computing policy should support or limit these institutional changes needs to be considered along with the educational programs themselves.

4.0 Conflicts Between the Individual’s Right to Privacy and the Organization’s Need to Process Information

The issue of confidentiality and invasion of privacy became increasingly prominent as the use of computers became widespread. Rapid proliferation of computers makes individuals increasingly vulnerable to abuse by computer experts. It is theoretically impossible to have a perfect protective system for data and system security. Although many effective security measures have been proposed, some of the safeguards that users want would be prohibitively expensive. There is a wide gap between the safeguards that can be implemented in the laboratory and those available in most commercial systems.

What constitutes an appropriate, satisfactory, or desired level of user and system protection is a serious subject of debate. Those responsible for operating computer systems are often more concerned with economic factors and machine efficiency. Moral, ethical, and legal responsibilities and liabilities must be carefully considered. However, requiring perfect or near-perfect security will certainly impede many educational applications from which students and faculty could benefit. Computing policies should provide guidelines for developing systems and operations that strike an appropriate balance between these competing concerns.

5.0 Difficulties in Developing and Adopting Conventions and Standards

The proliferation of computer hardware and software systems often leads a campus into technical difficulties and potential economic and operational disasters. The balance between system efficiency and application flexibility is an extremely difficult issue. The short but rapid history of computing is probably one reason for the lack of strong industry standards. Standardization is as much a political as a technical issue. It becomes particularly important when distributed and networked computing systems are introduced.

Why, what, where, when, and how standards should be introduced and adopted are critical problems, yet they have gone unrecognized on most campuses. Very little assistance can be expected from vendors; in fact, it is dangerous to consult them, since they may push for a particular set of standards that directly or indirectly favors their own products. Standards strongly influence, and may limit, the future adoption and expansion of new technologies, the development and use of certain types of hardware and software systems, computer purchasing policies and procedures, and the economics of systems development and operation. One must carefully consider how a campus computing policy should address the issue of developing and adopting appropriate conventions and standards.

6.0 Summary

The subject of a campus computing policy is so complex that no definitive statement could possibly be made at this time. Such a policy not only affects the operation and use of computing and communications systems; it has an enormous effect on the educational system itself. At present, there are very few, if any, explicit computing policy statements on American campuses.

The intricacy and difficulty of developing a computing policy have been clearly demonstrated by Leslie Burkholder’s presentation on the development of a Code of Computer Ethics on a university campus. My panel discussion is based on his presentation, and I have attempted, in a modest manner, to identify and explain some of the issues relevant to the development of a campus computing policy. Only a few obvious issues have been raised here, to motivate further discussion during the panel session.

Computer technology by itself does not create these issues. It is the organization’s computing policy and its way of allocating resources that determine the speed and nature of possible impacts. New tools are introduced to obtain a desired benefit, and the costs associated with such a decision must be weighed. The impacts of adopting powerful computer tools are enormous and may cause institutional change. It is my hope that this panel session will serve to clarify some issues, raise more questions, and explain some of the many dimensions of this complex and highly important subject.

University of Connecticut

Policy and Guidelines: Some Comments as the University of Delaware’s Draft Responsible Computing Policy Nears Approval

About four years ago, our department director called two of us into her office. One of our student consultants had broken into a computer at another university, securing root privileges for himself, using his work account and an account in the Computer Science Lab to do the deed. At the time, the University did not have any formal policy about computer crime, unauthorized access to computing resources, or responsible use of computing resources. As I write these prefatory notes [in 1991], it still does not. However, the “Policy for Responsible Computing Use at the University of Delaware” is nearing final approval: The policy has been on the agenda of the full Faculty Senate twice in the past seven months but has been referred back to committee both times. We anticipate its passage some time in 1992.

The attached “Recommended Guidelines for Responsible Computing at the University of Delaware” began as a two-page draft computing ethics statement.2 Since 1987, the document has benefited from review by and input from faculty senators, senior vice presidents, interested faculty and students, deans, Computing Center staff, and colleagues at other universities and colleges. As the issues under discussion multiplied, we had to clarify the distinction between policy on the one hand and procedures and implementation on the other. Thus, we now have two documents before the University community:

  • a one-page policy requiring formal Faculty Senate approval
  • a longer set of guidelines containing non-binding recommendations for implementing the policy.

The policy statement sets forth an ethical framework for computing use on our campus. It stresses that all users are responsible for the integrity of the computing and information resources and outlines who can authorize access to those resources. It defines “abuse” as unauthorized access or use of the University’s computing resources and outlines, in general terms, possible disciplinary actions. The policy tries to state things in positive terms, although the language is, at times, sterner than that suggested by some faculty members.

The longer, non-binding guidelines document will be issued by our department, Computing and Network Services (CNS), to help individual users, system administrators, and the general university community understand the implications of the policy and understand how the policy translates into action. CNS is soliciting input from users around the campus, but the University administration is not requiring that this document be put through a formal, campus-wide approval process. The current draft has ten sections.

  • The guidelines begin with a “Preface,” informally stating that all users are responsible for the well-being of the computing resources and stressing that an open network and a free exchange of ideas depend on everybody’s cooperation.
  • A second prefatory section defines some of the terms used in the document.
  • The body begins with a reprint of the actual one-page policy.
  • User responsibilities are outlined, stressing what one ought to do as opposed to stressing forbidden practices. This section also stresses user self-reliance and the supervisor’s role in teaching his or her staff or students good computing practices.
  • The section on system administrator responsibilities sets forth the proposition that in the ordinary course of events, “[a] system administrator’s use of the University’s computing resources is governed by the same guidelines as any other user’s computing activity.” It then offers general guidelines, particularly useful to new system administrators, for a system administrator’s additional responsibilities.
  • The section on misuse of computing resources lists examples of prohibited activity.
  • The section titled “User Confidentiality and System Integrity” attempts to answer a controversial pair of questions: When should a system administrator examine user information and what should he or she do about those situations in which he or she sees user information? This section is a direct result of the sometimes heated exchanges that have occurred between faculty members and system administrators.
  • The section on penalties for misuse of computing states that these matters need to be referred to the appropriate due process. It also reminds all parties that federal and state laws may apply.
  • The section on academic honesty begins by paraphrasing a sentence from Brown University’s statement on Ethical and Responsible Computing: computer-assisted plagiarism is still plagiarism. Our Dean of Students endorsed this section very early in the review process.
  • The final section lists the works we have consulted as we prepared this draft. This section may be cut from the final version; however, it has helped us educate the campus and stimulate discussion of the points we make in the other sections.

It is easy to get people to agree to statements like “Don’t abuse computing resources.” But translating that sentiment into policy and then delineating the procedural implications of the policy can be difficult. Our task was complicated because we have been trying to develop one responsible computing policy that applies to the entire University. However, because the University is a relatively large organization,3 and because the computing resources on campus are “owned” in a number of different ways,4 we decided that no one set of implementation rules and procedures could meet all campus needs. Therefore, we recast our procedures document and called it “Recommended Guidelines for Responsible Computing.”

During our National Conference on Computing & Values working group’s first meeting, one participant wondered aloud why one needs to have a “computing ethics policy.” There are many arguments for such a policy; at the University of Delaware, we stressed four of them when we presented our draft responsible computing documents for review:

First, a policy for responsible computing defines who is authorized to grant access to resources and, therefore, defines what constitutes authorized and unauthorized access to a computing resource. Doing so also helps draw distinctions between access to the computer and access to information stored on the computer.

Second, a policy for responsible computing protects an organization, its computing resources, its clients or students, and its employees. By adopting such a policy, an organization outlines the rights and responsibilities of all parties involved – providing important legal protection for everybody.

Third, a responsible computing policy should emphasize that we are not inventing new rules for acceptable behavior as much as we are applying existing definitions of acceptable and unacceptable behavior to a new area. Most of our students, for example, do not need a reminder that it is wrong to tear pages from a book housed in our library or to take money from a neighboring dormitory room; however, many of our students do need to be reminded that copying software from a University lab or browsing mainframe directories for unprotected files is ordinarily not acceptable behavior.

Fourth and – in my opinion – most important for a university, a policy for responsible computing educates. If promulgated widely with additional training or supporting material, such a policy helps a university train its employees, faculty, and students about authorized access, permissible computing practices, and good computing and data management. This function helps the university itself and helps students prepare for the computing environments they will encounter after graduation. Furthermore, the review process itself can be educational because the application of “ethics” or “rules of conduct” to computing is a relatively new area of discussion on most campuses.

As a matter of fact, the review process to date has helped faculty and students learn more about system administrators’ points of view and has helped remind system administrators about faculty and student concerns. For example, from a system administrator’s point of view, many faculty and students have unrealistic expectations about the confidentiality of the information stored on a computer; from many faculty members’ point of view, too many system administrators are prying into areas that they ought not be looking into. That is, we have had to educate some users about the realities of working on a shared system. (For example, if one of your jobs threatens to kill other users’ jobs or crash a timesharing system, a system administrator must investigate.) We have also had to remind some system administrators that they must initiate notification procedures in those rare instances when a user’s information has been reviewed.

The review process has also taught us more about certain work relationships at our University. For instance, faculty members and academic staff learned some valuable lessons about the administrative point of view during our discussions about the policy statement’s opening sentences. As at many universities and colleges, anything that might possibly imply a curtailment of students’ and faculty’s “academic freedom” sets off alarms for many faculty members and students. Knowing that the effort to put a policy into effect could be construed as limiting academic freedom, one Faculty Senate committee recommended that the policy include language like the following: “The University of Delaware aims to provide the best possible computing and information resources to students, faculty, and staff and manages these resources in such a way that members of the University community can participate in an open exchange of ideas with each other, with colleagues at other universities, and with appropriate off-campus information resources.” This open approach requires that all members of the University community who use the University’s computing and information resources act cooperatively and responsibly. However, this language caused the University Treasurer to object on the grounds that his staff were bound more by rules of confidentiality than by rules of openness. Other University administrators agreed; since our goal is to have one all-encompassing policy, we changed the language to that in the accompanying draft guidelines. However, the preface to the guidelines still conveys the message that we are, for the most part, trying to maintain an “open” computing environment.

Finally, as we lead discussions about the draft policy and guidelines, we find that we are helping the University community learn more about computing and information technology in general. At first, many faculty, students, and staff misunderstood the aims of the policy because they relied too heavily on analogies and inexact comparisons to “understand” computing technology. But as we discussed the issues raised by the policy, our users learned more about, for example, how electronic mail really works, how one person’s work can affect other users’ work on a time-sharing system, and even why software piracy is wrong, even in the face of the argument, “But I didn’t steal it. You still have your copy. I just copied it.”

And so, the most important consequence of our efforts is that the campus is more aware of security issues, responsible computing practices, the relationships between the users of the resources, the relationships between the users and the providers, and the relationships between the users and the resources themselves. And by having both a policy statement and a set of recommended guidelines, we have provided information about how the policy statement translates into user and system administrator actions.

If you are in the process of developing a responsible computing policy for your organization, we recommend that you consult the Site Security Handbook: RFC 1244, available in the computer file /pub/ssphwg/rfc1244.txt on cert.sei.cmu.edu. Released by the Internet Engineering Task Force in July 1991, this document provides a wealth of information with which you can educate senior decision-makers, faculty, staff, and students about responsible computing issues, data management, and computing security.
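For readers who want to retrieve the handbook electronically, the following is a minimal sketch, written in present-day Python rather than drawn from the original text; it assumes only anonymous FTP access to the host and path named above, and that 1991-era host may, of course, no longer serve the file.

    # Minimal sketch: fetch RFC 1244 by anonymous FTP from the host and
    # path cited above. Hypothetical usage; the RFC is also mirrored by
    # the usual RFC archives if this host is unreachable.
    from ftplib import FTP

    with FTP("cert.sei.cmu.edu") as ftp:   # host as given in the text
        ftp.login()                        # anonymous login
        with open("rfc1244.txt", "wb") as out:
            ftp.retrbinary("RETR /pub/ssphwg/rfc1244.txt", out.write)
    print("Saved rfc1244.txt")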

We hope that the accompanying draft document, “Recommended Guidelines for Responsible Computing at the University of Delaware,” will help other organizations discuss, develop, and implement policies and procedures for responsible computing at their own institutions.

University of Delaware

End Notes

  1. Some portions of these comments were presented at the National Conference on Computing & Values [NCCV] (Southern Connecticut State University) in an enrichment presentation entitled “‘Look What They’ve Done to My Policy, Ma’: A Report on the Development of a Responsible Computing Policy at the University of Delaware” (August 13, 1991). In addition, the draft guidelines and an article about the approval process for our responsible computing policy will be appearing in a forthcoming issue of Computer Security Journal, published by the Computer Security Institute, 600 Harrison Street, San Francisco, California, 94107. A short adaptation of one section of the NCCV talk, on what we might have done differently if we were starting the process now, will be appearing in a forthcoming issue of Computer Security Alert, also published by the Computer Security Institute.
  2. Working with me at the time were Andrew Frake, now at Johns H

Recommended Guidelines for Responsible Computing at the University of Delaware

1.0 Preface

The computer has become a common denominator that knows no intellectual, political, or bureaucratic bounds; the Sherwin Williams of necessity that covers the world, spanning all points of view…. I wish that we lived in a golden age, where ethical behavior was assumed; where technically competent programmers respected the privacy of others; where we didn’t need locks on our computers…. Fears for security really do louse up the free flow of information. Science and social progress only take place in the open. The paranoia that hackers leave in their wake only stifles our work.
Cliff Stoll, The Cuckoo’s Egg: Tracking a Spy
Through the Maze of Computer Espionage

One of the interesting facets of Cliff Stoll’s The Cuckoo’s Egg is his growing awareness of the responsibilities all computer users have to each other. It is our hope that this set of Guidelines can foster that same understanding in the University of Delaware community.

It is imperative that all users of the University’s computing and information resources realize how much these resources require responsible behavior from all users. Simply put, we are all responsible for the well-being of the computing, network, and information resources we use.

Universities do try to promote the open exchange of ideas; however, an open, cooperative computing network can be vulnerable to abuse or misuse. As more and more schools, colleges, universities, businesses, government agencies, and other enterprises become attached to the worldwide computing and information networks, it is more important than ever that this University educate its students, faculty, and staff about proper ethical behavior, acceptable computing practices, and how “computer vandalism” interferes with the exchange of ideas that is integral to a modern education.

The first item in the body of this document is the University’s Policy for Responsible Computing Use, passed by the Faculty Senate of the University of Delaware on [date will go here]. The remainder of this document consists of recommended guidelines for implementing this policy. If you have any questions about the policy or the guidelines, please consult with your system administrator, with the staff in Computing and Network Services, or with your dean, project director, supervisor, chair, or adviser.

2.0 Definition of Terms

Administrative Officer: Vice-president, dean, chair, or director to whom an individual reports.

Computer Account: The combination of a user number, username, or user ID and a password that allows an individual access to a mainframe computer or some other shared computer.

Data Owner: The individual or department that can authorize access to information, data, or software and that is responsible for the integrity and accuracy of that information, data, or software. Specifically, the data owner can be the author of the information, data, or software or can be the individual or department that has negotiated a license for the University’s use of the information, data, or software.

Desktop Computers, Microcomputers, Advanced Workstations: Different classes of smaller computers, some shared, some single-user systems. If owned or leased by the University or if owned by an individual and connected to a University-owned, leased, or operated network, use of these computers is covered by the Policy for Responsible Computing Use.

Information Resources: In the context of these Guidelines, this phrase refers to data or information and to the software and hardware that make that data or information available to users.

Mainframe Computers: “Central” computers capable of use by several people at once. Also referred to as “time-sharing systems.”

Network: A group of computers and peripherals that share information electronically, typically connected to each other by either cable or satellite link.

Normal Resource Limits: The amount of disk space, memory, printing, etc. allocated to your computer account by that computer’s system administrator.

Peripherals: Special-purpose devices attached to a computer or computer network – for example, printers, scanners, plotters, etc.

Project Director: Person charged with administering a group of computer accounts and the computing resources used by the people using those computer accounts.

Server: A computer that contains information shared by other computers on a network.

Software: Programs, data, or information stored on magnetic media (tapes, disks, diskettes, cassettes, etc.). Usually used to refer to computer programs.

System Administrator: Staff employed by a central computing agency such as Computing and Network Services whose responsibilities include system, site, or network administration and staff employed by other University departments whose duties include system, site, or network administration. Note that if you have a computer on your desk, you may be considered that system’s system administrator.

3.0 Policy for Responsible Computing Use at The University of Delaware

In support of its mission of teaching, research, and public service, the University of Delaware provides access to computing and information resources for students, faculty, and staff, within institutional priorities and financial capabilities.

All members of the University community who use the University’s computing and information resources must act responsibly. Every user is responsible for the integrity of these resources. All users of University-owned or University-leased computing systems must respect the rights of other computing users, respect the integrity of the physical facilities and controls, and respect all pertinent license and contractual agreements. It is the policy of the University of Delaware that all members of its community act in accordance with these responsibilities, relevant laws and contractual obligations, and the highest standard of ethics.

Access to the University’s computing facilities is a privilege granted to University students, faculty, and staff. Access to University information resources may be granted by the owners of that information based on the owner’s judgment of the following factors: relevant laws and contractual obligations, the requestor’s need to know, the information’s sensitivity, and the risk of damage to or loss by the University.

The University reserves the right to limit, restrict, or extend computing privileges and access to its information resources. Data owners – whether departments, units, faculty, students, or staff – may allow individuals other than University faculty, staff, and students access to information for which they are responsible, so long as such access does not violate any license or contractual agreement; University policy; or any federal, state, county, or local law or ordinance.

Computing facilities and accounts are owned by the University and are to be used for the University-related activities for which they are assigned. University computing resources are not to be used for commercial purposes or for non-University-related activities without written authorization from the University. In such cases, the University will require payment of appropriate fees. This policy applies equally to all University-owned or University-leased computers.

Users and system administrators must all guard against abuses that disrupt or threaten the viability of all systems, including those at the University and those on networks to which the University’s systems are connected. Access to information resources without proper authorization from the data owner, unauthorized use of University computing facilities, and intentional corruption or misuse of information resources are direct violations of the University’s standards for conduct as outlined in the University of Delaware Policy Manual, the Personnel Policies and Procedures for Professional and Salaried Staff, the Faculty Handbook, and the Official Student Handbook and may also be considered civil or criminal offenses.

The University of Delaware treats access and use violations of computing facilities, equipment, software, information resources, networks, or privileges seriously. Disciplinary action resulting from such abuse may include the loss of computing privileges and other sanctions including non-reappointment, discharge, dismissal, and legal action – including prosecution under Title 11, §931 – §939 of the Delaware Code, the Computer Fraud and Abuse Act of 1986, or other appropriate laws.

May 31, 1991

4.0 User Responsibilities

If you use the University’s computing resources or facilities, you have the following responsibilities:

  • Use the University’s computing facilities and information resources, including hardware, software, networks, and computer accounts, responsibly and appropriately, respecting the rights of other computing users and respecting all contractual and license agreements.
  • Use only those computers and computer accounts for which you have authorization.
  • Use mainframe accounts only for the purpose(s) for which they have been issued. Use University-owned microcomputers and advanced workstations for University-related projects only.
  • Be responsible for all use of your accounts and for protecting each account’s password. In other words, do not share computer accounts. If someone else learns your password, you must change it.
  • Report unauthorized use of your accounts to your project director, instructor, supervisor, system administrator, or other appropriate University authority.
  • Cooperate with system administrator requests for information about computing activities. Under certain unusual circumstances, a system administrator is authorized to access your computer files.
  • Take reasonable and appropriate steps to see that all hardware and software license agreements are faithfully executed on any system, network, or server that you operate.

Each user is ultimately responsible for his or her own computing and his or her own work using a computer. Take this responsibility seriously. For example, users should remember to make backup copies of their data, files, programs, diskettes, and tapes, particularly those created on microcomputers and those used on individually or departmentally operated systems. Furthermore, users with desktop computers or other computers that they operate themselves must remember that they may be acting as the system administrators for those computers and need to take that responsibility very seriously.

If you are a project director for a group of mainframe computing users, a supervisor whose staff use computers, or a faculty member whose students use computers, you must help your project members, staff, or students learn more about ethical computing practices. You should also help your project members, staff, or students learn about good computing practices and data management.

5.0 System Administrator Responsibilities

This document uses the phrase system administrator to refer to all of the following University personnel:

  • Staff employed by a central computing agency such as Computing and Network Services whose responsibilities include system, site, or network administration.
  • Staff employed by other University departments whose duties include system, site, or network administration.

A system administrator’s use of the University’s computing resources is governed by the same guidelines as any other user’s computing activity. However, a system administrator has additional responsibilities to the users of the network, site, system, or systems he or she administers:

  • A system administrator manages systems, networks, and servers to provide users with the software and hardware available for their University computing.
  • A system administrator is responsible for the security of a system, network, or server.
  • A system administrator must take reasonable and appropriate steps to see that all hardware and software license agreements are faithfully executed on all systems, networks, and servers for which he or she has responsibility.
  • A system administrator must take reasonable precautions to guard against corruption of data or software or damage to hardware or facilities.2
  • A system administrator must treat information about and information stored by the system’s users as confidential.

As an aid to a better understanding of responsible computing practices, all departments that own or lease computing equipment are encouraged to develop “Conditions Of Use” documentation for all systems that they operate and to make these “Conditions Of Use” documents available to users. These documents should be consistent with the University of Delaware Policy for Responsible Computing Use (reprinted in Section 3.0 of these Guidelines) and should be approved by the department’s administrative officer or other individual designated by that administrative officer.

6.0 Misuse of Computing and Information Resource Privileges

The University characterizes misuse of computing and information resources and privileges as unethical and unacceptable and as just cause for taking disciplinary action. Misuse of computing and information resources and privileges includes, but is not restricted to, the following:

  • Attempting to modify or remove computer equipment, software, or peripherals without proper authorization
  • Accessing computers, computer software, computer data or information, or networks without proper authorization, regardless of whether the computer, software, data, information, or network in question is owned by the University (That is, if you abuse the networks to which the University belongs or the computers at other sites connected to those networks, the University will treat this matter as an abuse of your University of Delaware computing privileges.)
  • Circumventing or attempting to circumvent normal resource limits, logon procedures, and security regulations
  • Using computing facilities, computer accounts, or computer data for purposes other than those for which they were intended or authorized
  • Sending fraudulent computer mail, breaking into another user’s electronic mailbox, or reading someone else’s electronic mail without his or her permission
  • Sending any fraudulent electronic transmission, including but not limited to fraudulent requests for confidential information, fraudulent submission of electronic purchase requisitions or journal vouchers, and fraudulent electronic authorization of purchase requisitions or journal vouchers
  • Violating any software license agreement or copyright, including copying or redistributing copyrighted computer software, data, or reports without proper, recorded authorization
  • Violating the property rights of copyright holders who are in possession of computer-generated data, reports, or software
  • Harassing or threatening other users or interfering with their access to the University’s computing facilities
  • Taking advantage of another user’s naiveté or negligence to gain access to any computer account, data, software, or file other than your own
  • Encroaching on others’ use of the University’s computers (e.g., disrupting others’ computer use by excessive game playing; sending frivolous or excessive messages, either locally or off-campus; printing excess copies of documents, files, data, or programs; modifying system facilities, operating systems, or disk partitions; attempting to crash or tie up a University computer; damaging or vandalizing University computing facilities, equipment, software, or computer files)
  • Disclosing or removing proprietary information, software, printed output or magnetic media without the explicit permission of the owner
  • Reading other users’ data, information, files, or programs on a display screen, as printed output, or via electronic means, without the owner’s explicit permission.

7.0 User Confidentiality and System Integrity

If a system administrator is an eyewitness to a computing abuse; notices an unusual degradation of service or other aberrant behavior on the system, network, or server for which he or she is responsible; or receives a complaint of computing abuse or degradation of service, he or she should investigate and take steps to maintain the integrity of the system(s). If a system administrator has evidence that points to a user’s computing activity as the probable source of a problem or abuse under investigation, he or she must weigh the potential danger to the system and its users against the confidentiality of that user’s information.

While investigating a suspected abuse of computing; a suspected hardware failure; a disruption of service; or a suspected bug in an application program, compiler, network, operating system, or system utility, a system administrator should ordinarily ask a user’s permission before inspecting that user’s files, diskettes, or tapes. The next two paragraphs outline exceptions to this rule.

If, in the best judgment of the system administrator, the action of one user threatens other users, or if a system or network for which the system administrator is responsible is in grave, imminent danger of crashing, sustaining damage to its hardware or software, or sustaining damage to user jobs, the system administrator should act quickly to protect the system and its users. In the event that he or she has had to inspect user files in the pursuit of this important responsibility, he or she must notify, as soon as possible, his or her own administrative officer or other individual designated by that administrative officer of the action and the reasons for taking it. The administrative officer needs to be certain that one of the following is also notified: the user or users whose files were inspected, or the user’s supervisor, project director, administrative officer, or academic advisor. This notification is a departmental responsibility, not a personal responsibility of the system administrator.

In cases in which the user is not available in a timely fashion, in which the user is suspected of malicious intent to damage a computer system, or in which notifying the user would impede a sensitive investigation of serious computer abuse, the system administrator may inspect the information in question so long as he or she notifies his or her own administrative officer, or other individual designated by that administrative officer, of the actions taken and the reasons for taking them. The administrative officer needs to be certain that the user’s supervisor, project director, administrative officer, or academic advisor is notified of the situation. In the case of suspected malicious intent, the administrative officer may also need to refer the matter to the appropriate University judicial body or to the Department of Public Safety.

A system administrator may find it necessary to suspend or restrict a user’s computing privileges during the investigation of a problem. The system administrator should confer with his or her administrative officer or other person designated by that administrative officer before taking this step. A user may appeal such a suspension or restriction and petition for reinstatement of computing privileges through the University’s judicial system, through the grievance procedures outlined in the faculty collective bargaining agreement, or by petition to the Dean of Students.

In general, then, a system administrator should:

  • protect the integrity of the system entrusted to his or her care
  • respect the confidentiality of the information users have stored on the system
  • notify appropriate individuals when the above two aims have come into conflict
  • assist his or her administrative officer in referring cases of suspected abuse to the appropriate University judicial process.

8.0 Penalties for Misuse of Computing and Information Resource Privileges

Abuse of computing privileges is subject to disciplinary action. If system administrators or staff in the Department of Public Safety have a preponderance of evidence that intentional or malicious misuse of computing resources has occurred, and if that evidence points to the computing activities or the computer files of an individual, they have the obligation to pursue any or all of the following steps to protect the user community:

  • Notify the user’s project director, instructor, academic advisor, or administrative officer of the investigation.
  • Refer the matter for processing through the University’s judicial system. If necessary, staff members from a central computing agency such as Computing and Network Services as well as faculty members with computing expertise may be called upon to advise the University judicial officers on the implications of the evidence presented and, in the event of a finding of guilt, of the seriousness of the offense.
  • Suspend or restrict the user’s computing privileges during the investigation. A user may appeal such a suspension or restriction and petition for reinstatement of computing privileges through the University’s judicial system, through the grievance procedures outlined in the faculty collective bargaining agreement, or by petition to the Dean of Students.
  • Inspect that user’s files, diskettes, and/or tapes. System administrators must be certain that the trail of evidence leads to the user’s computing activities or computing files before inspecting the user’s files.

Ordinarily, the administrative officer whose department is responsible for the computing system on which the alleged misuse occurred should initiate proceedings. As the case develops, other administrative officers may, by mutual agreement, assume the responsibility for prosecuting the case.

Disciplinary action may include the loss of computing privileges and other disciplinary sanctions up to and including a failing grade, discharge, dismissal, and legal action. In some cases, an abuser of the University’s computing resources may also be liable for civil or criminal prosecution.

It should be understood that nothing in these guidelines precludes enforcement under the laws and regulations of the State of Delaware, any municipality or county therein, and/or the United States of America. For example, if you are found guilty of committing a computer crime as outlined in Title 11 §932 – §936 of the Delaware Code, you could be subject to the penalties for a class B felony.

9.0 Academic Honesty

Faculty and students are reminded that computer-assisted plagiarism is still plagiarism. Unless specifically authorized by a class instructor, all of the following uses of a computer are violations of the University’s guidelines for academic honesty and are punishable as acts of plagiarism:

  • copying a computer file that contains another student’s assignment and submitting it as your own work
  • copying a computer file that contains another student’s assignment and using it as a model for your own assignment
  • working together on an assignment, sharing the computer files or programs involved, and then submitting individual copies of the assignment as your own individual work
  • knowingly allowing another student to copy or use one of your computer files and to submit that file, or a modification thereof, as his or her individual work.

For further information on this topic, students are urged to consult the University of Delaware Official Student Handbook, to consult with their individual instructors, and to refer to the pamphlet “Academic Honesty & Dishonesty: Important information for faculty and students.”

Faculty members are urged to develop specific policies regarding all aspects of academic honesty and to communicate those policies to their students in writing.

10.0 Works Consulted

Charles Augustine, “The Pieces of a Policy: Categories for Creation of a Computer Ethics Policy,” Capitalizing on Communication: Proceedings of ACM SIGUCCS User Services Conference XVII, 1989.

Baylor University, Computer Policies, 1989. (Copy located in the computer file ethics/Baylor.policy on ariel.unm.edu.)

Catholic University of America, Statement of Ethics in the Use of Computers, 1988. [Reprinted in ACM SIGUCCS Newsletter, Volume 19, Number 1, 1989.]

Gary Chapman, “CPSR [Computer Professionals for Social Responsibility] Statement on the Computer Virus,” Communications of the ACM, Volume 32, Number 6, 1989.

Colgate University, Agreement for Use of Computing Facilities, 1989. (Copy located in the computer file ethics/ColgateU.policy on ariel.unm.edu.)

Columbia University, Administrative Policies of the Center for Computing Activities, [no date]. (Copy located in the computer file ethics/ColumbiaU.policy on ariel.unm.edu.)

Corporation for Research and Educational Networking, “Acceptable Use of CSNET and BITNET,” 1990. (Received via electronic mail from Bernard A. Galler, March 23, 1990.)

Delaware Code (Annotated), Computer-Related Offenses, Title 11, §931 – §939, 1987.

Delaware Code (Annotated), 1989 Supplement, Computer-Related Offenses, Title 11, §937, 1989.

EDUCOM and ADAPSO, Using Software: A guide to the ethical and legal use of software for members of the academic community, EDUCOM, 1987.

Mark W. Eichin and Jon A. Rochlis, “With Microscope and Tweezers: An Analysis of the Internet Virus of November 1988.” Paper presented at 1989 IEEE Symposium on Research in Security and Privacy. (Copy located in the file pub/virus/mit.PS on bitsy.mit.edu.)

David M. Ermann, Mary B. Williams, and Claudio Gutierrez, eds., Computers, Ethics, and Society, Oxford University Press, 1990.

Faculty Senate of the University of Delaware, Ethetical [sic] Conduct in Computing, unpublished draft statement discussed by the Faculty Senate in 1989.

David J. Farber, “NSF [National Science Foundation] Poses Code of Networking Ethics,” Communications of the ACM, Volume 32, Number 6, 1989.

Fraser Valley College, DRAFT: Fraser Valley College Computing and Ethics Policy, April 23, 1991. (Copy received via electronic mail, April 24, 1991, from Paul Herman, Fraser Valley College.)

Katie Hafner and John Markoff, Cyberpunk: Outlaws and Hackers on the Computer Frontier, Simon and Schuster, 1991.

W. Michael Hoffman and Jennifer Mills Moore, eds., Ethics and the Management of Computer Technology: Proceedings of the Fourth National Conference on Business Ethics Sponsored by the Center for Business Ethics, Bentley College, Oelgeschlager, Gunn, and Hain, 1982.

Indiana University, Academic Computing Policy Committee, Subcommittee on Ethical Use of Computers, “Computer Users’ Privileges and Responsibilities: Indiana University,” 1990. (Copy received via electronic mail April 25, 1990, from Mark Sheehan, Indiana University Computing Services.)

Internet Activities Board, Ethics Policy Statement, 1988. [Reprinted in Purdue University’s PUCC Newsletter, March 1989.]

Internet Engineering Task Force, Site Security Handbook: RFC 1244. P. Holbrook and J. Reynolds, eds. July 1991. (Copy located in the file pub/ssphwg/rfc1244.txt on cert.sei.cmu.edu.)

Deborah G. Johnson, Computer Ethics, Prentice-Hall, 1985.

John Lees, [Michigan State University] College of Engineering Computer Use Policy – DRAFT, 1990. (Received via electronic mail April 23, 1990, from John Lees.)

Margaret Loy Mason, “Ethics & Electronic Communication: An Adventure in User Education,” New Centerings in Computing Services: Proceedings of ACM SIGUCCS User Services Conference XVIII, 1990.

National Science Foundation, “NSFNET Interim Conditions of Use Policy,” Link Letter, Volume 3, Number 3, 1990. (Also available in the file nsfnet/netuse.txt on nis.nsf.net.)

Donn B. Parker, Susan Swope and Bruce N. Baker, Ethical Conflicts in Information and Computer Science, Technology, and Business, QED Information Sciences, 1990.

Jane N. Ryland, “Security – A Sleeper Issue Comes into its Own,” Cause/Effect, Volume 12, Number 4, 1989.

Software Publishers Association, Software Use and the Law: A guide for individuals, educational institutions, user groups, and corporations, [no date].

Eugene H. Spafford, “Some Musings on Ethics and Computer Break-Ins,” 1989. (Copy located in the file pub/virus/spaf.PS.Z on bitsy.mit.edu.)

Cliff Stoll, The Cuckoo’s Egg: Tracking a spy through the maze of computer espionage, Doubleday, 1989.

Syracuse University, Computer Use Policy, [no date].

Temple University, Rules of Conduct for Using Computing Resources at Temple University, 1988.

University of Delaware, Academic Honesty & Dishonesty: Important information for faculty and students, 1989.

University of Delaware, “Code of Conduct,” Official Student Handbook, 1989.

University of Delaware, “Code of Ethics,” Personnel Policies and Procedures for Professional and Salaried Staff, 1989.

University of Delaware, Computer Software, University of Delaware Policy Manual, Policy 6 – 9, 1989.

University of Delaware, Misconduct in Research, University of Delaware Policy Manual, Policy 6 – 11, 1989.

University of Delaware, University of Delaware Faculty Handbook, 1990 [online edition consulted].

University of Delaware, 1989 – 1990 Residence Halls Handbook, 1989.

University of Delaware Libraries, Circulation Procedures and Services, [no date].

University of Michigan – Ann Arbor, Think About It: The Proper Use of Information Resources, Information Technology, and Networks at the University of Michigan, [no date].

University of New Mexico, UNM Ethics Code for Computer Use [Draft], 1989. (Copy located in the computer file ethics/UofNewMexico.policy on ariel.unm.edu.)

Ronald F. E. Weissman, “Ethical and Responsible Computing,” The Open Window (Brown University), Volume 3, Number 1, 1989. [Cited in Ryland’s article.]

End Notes

  1. The software made available by the University has been licensed by the University for your use. As a result, its use may be subject to certain limitations.
  2. The University is not responsible for loss of information from computing misuse, malfunction of computing hardware, malfunction of computing software, or external contamination of data or programs. The staff in central computing units such as Computing and Network Services and all other system administrators must make every effort to ensure the integrity of the University’s computer systems and the information stored thereon. However, users must be aware that no security or back-up system is 100% foolproof.

The Ethics of Evaluating Instructional Computing

Leslie Burkholder’s address demonstrates the wide scope of problems, questions, and issues that characterize the realm of academic computing. In the following pages, the focus will be upon issues related to one of the questions he cites: “May a researcher, for example, look at files recording a student’s revisions of his essays, without that student’s permission, in order to complete a study of how a writing tool is used?”(1) My interest in this sort of question derives from the direction in which it leads and the points it makes explicit about academic computing. It reminds us that instructional computing is a growing part of the academic use of computers. Moreover, the use of computers in academia supports not only the teaching effort but also, in a great variety of ways, the conduct of empirical research.

In order to address the ethical questions that arise when computers are at the center of both instructional and research efforts, a case analysis approach will be used. The case to be analyzed is a hypothetical one concerning evaluative research on the use of computer-assisted instruction (CAI) programs. It involves an instructor carrying out a controlled experiment in which students in a large class were randomly assigned to one of two conditions: traditional classroom instruction, or that same instruction supplemented by CAI. One group of students thus had access to instructional computer programs while the other (the control group) did not. Students in the control group discovered that the grades of students using the CAI were higher than their own.

By the middle of the term the instructor realized that students in the experimental group who had access to CAI were doing much better than students in the control group. Some students in the control group sensed this difference and complained that they were being denied an educational opportunity which others who had paid the same tuition were getting. These students insisted that the instructor discontinue the experiment and allow them to use the CAI package for the remainder of the term. As one student emphatically put it, “I’m not a guinea pig!”(2)

The ethical difficulties bound up in this attempt to evaluate CAI programs are nicely illustrated in previously published comments on this case by Jim Moor and Christine Overall. After their commentaries are presented, along with my responses, some remarks will be made about the ethical considerations that are significant here.

1.0 Moor’s Analysis

Moor analyzes this case in terms of a conflict between the need for educational research and the rights of students. This conflict is made explicit by means of the following argument in which the first premise represents “the instructor’s position” while the second premise represents “the student’s position.”

  1. In order to properly evaluate innovative instructional methods, complete and carefully controlled experiments involving students must be performed.
  2. If complete and carefully controlled experiments involving students are performed, then some students are treated unfairly without their consent.
  3. Consequently, either innovative educational methods are not properly evaluated or some students are treated unfairly without their consent.
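Although Moor does not formalize the argument, its validity is easy to exhibit; the following sketch is my own shorthand, not his. Let V stand for “innovative instructional methods are properly evaluated,” E for “complete and carefully controlled experiments involving students are performed,” and U for “some students are treated unfairly without their consent.” The premises and conclusion then instantiate a valid form:

\[
V \rightarrow E, \qquad E \rightarrow U \;\vdash\; \neg V \lor U
\]

Hypothetical syllogism yields V → U, which is classically equivalent to ¬V ∨ U; anyone who resists the conclusion must therefore reject one of the premises, and that is precisely where the instructor’s and students’ positions divide.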

Developers of CAI are thus faced with a choice between two undesirable alternatives. They must repudiate either their responsibility to develop sound pedagogy or their responsibility to treat students fairly. After acknowledging that experimental studies are not necessary for many types of program evaluation and are often extremely difficult to execute, Moor concludes that “at some point in the development of CAI only controlled experiment… can determine the real effectiveness of CAI.”(3) Given this support for the instructor’s position, the question is whether the adverse consequences predicted for students in the second premise actually occur. This leads to a consideration of the “students’ position,” which is presented as centering on two ethical issues: unequal educational opportunities and lack of consent. With respect to the first issue, students believe that they have a right to equal educational opportunities, such that no student should be deprived in this regard. Nevertheless, Moor points out the limitations of this view.

Different and even unequal educational opportunities do not automatically indicate a violation of student rights…. An instructor can offer some different educational opportunities without moral consequences. For example, an instructor ethically may use different textbooks in different sections. Or, trivially, an instructor may help one student, whom the instructor happens to meet in the hall, without being obligated to help other students in the same way.(4)

An important component of this analysis is Moor’s belief that students in the control group are receiving normal and adequate instruction and that an instructor is obligated to provide no more than this. Once this obligation is fulfilled, the necessity of requesting student consent is questionable. It is the instructor’s prerogative, not the student’s, to select instructional methods. Normally, experiments in education differ from medical experiments, which may involve deception and possible harm from particular treatment conditions. Consequently, “students should be informed about what will happen during the term.”(5) But further consent or approval from students is not required.

Moor does agree, however, with the students’ claim that “some inherent injustice occurs when an instructor arbitrarily divides a class so that (as far as the instructor has good reason to believe) one half receives substantially more educational assistance than the other half even though both halves receive at least a normal amount of instruction.”(6) In order to resolve the conflict between this injustice and the need for carrying out controlled studies, Moor proposes that “final grades of the control group (experimental group) should be raised at the end of the course if the experimental group (control group) does significantly better and additional education should be offered at no charge to make up for any deficits in learning incurred because of the experiment.”(7) Hereafter, this proposal will be referred to as “Moor’s resolution.”

2.0 Overall’s Analysis

Overall’s analysis produces a much less favorable view of this type of in-class educational research. Much of her position is summed up in the following passage.

  • Students are entitled to expect that the available resources and unavoidable liabilities will be distributed similarly within the two sections of what purports to be the same course. They should not be expected and required, as members of a “control group,” to sacrifice access to educational technologies for the sake of possible gains in knowledge about the technologies’ effectiveness – knowledge which will benefit not them but only some hypothetical future students. Students ought not to be harmed by the accident of being enrolled in one section rather than another; they should not be deprived, without their knowledge or consent, of a better learning environment enjoyed by their peers in the other section.(8)

It is clear that, contra Moor, Overall takes very seriously the possibility of students being harmed by the experiment. This point underscores the need for informed consent since “without adequate advance information, students are denied any real choices about this course, and thus have been unjustly treated.”(9) Overall also proposes that the non-CAI students be given both access to the instructional programs and remedial instruction if necessary.

While Moor believes that an instructor is ethically bound only to provide adequate instruction, Overall requires more than this. “In offering a university course, the instructor undertakes to provide an optimum learning environment for her students.”(10) This point is related to the issue of whether an instructor is obligated to treat groups within the same course equally and to student claims concerning their rights to equal instruction for equal pay.

Distasteful as it may seem to some, students are the consumers of their education: they want value for their time and money, and they do not want to be deprived of products or services that other students may be receiving for exactly the same fees. Nor should this approach to education necessarily be seen as immature, inappropriate, materialistic, or immoral. If we assume that, as instructors, we have a responsibility to provide our students with the best instruction we possibly can, then the students should not be condemned for expecting to receive that instruction.(11)

In light of this responsibility, Overall recommends that controlled, experimental evaluation of CAI be altogether avoided. Rather, she proposes that the performance of students using CAI in a given course be compared with that of students in previous semesters prior to the introduction of the CAI. This procedure (hereafter “Overall’s resolution”) should allow instructors to carry out their obligations toward students while also developing innovative instructional techniques in a responsible manner.

3.0 Considerations of these Analyses

In providing their analyses, Moor and Overall make a number of recommendations. Some of these are directed to the particular hypothetical case described and specify how the instructor caught in this mid-semester predicament should respond. Other recommendations move beyond this question and address general issues of evaluating CAI. These are the main concern here, and it is clear that this hypothetical case highlights ethical issues involved in instituting and assessing educational innovations in general. In one important respect, however, the case analyzed is representative neither of controlled evaluations of CAI nor of assessments of educational innovation in general. In the hypothetical case, the superiority of the experimental treatment is presented as a given. In actual cases of evaluating CAI, the superiority of CAI over traditional methods is in doubt. Indeed, many empirical studies attempting to demonstrate that superiority have failed to do so. Consequently, the efficacy of these programs should not be assumed when discussing the ways in which students may be deprived or harmed by controlled evaluations of CAI. It is not clear to what extent this assumption underlies either Moor’s or Overall’s conclusion about CAI evaluation in general. Moor’s parenthetical qualification does seem crucial since his resolution is designed to address the “inherent injustice [which] occurs when an instructor arbitrarily divides a class so that (as the instructor may have good reasons to believe) one half receives substantially more educational assistance than the other half even though both halves receive at least a normal amount of instruction.”(12) This raises the question of whether Moor’s resolution would be required if the instructor did not have good reason to believe that the control group was being disadvantaged. A similar question can be raised about Overall’s recommendation to compare courses before and after the introduction of CAI, if it is based on the assumption that students are automatically making sacrifices, being harmed, or being deprived when assigned to control groups. In any event, the issue to be addressed here concerns the responsibilities and rights of instructors and students in the course of evaluating techniques whose effectiveness is genuinely in doubt.

This issue is important because students have a right to effective instruction and they should not be forcefully grouped in ways that preclude members of different groups from receiving equally effective instruction. The concept of equally effective instruction underlies the appeal to equal educational opportunities in this context and has an important relationship to the instructor’s responsibility for providing optimum instruction. In order to elaborate this point, further consideration will be given to the instructor’s responsibility for (1) ensuring equal educational opportunities, (2) providing optimum instruction, and (3) respecting the requirement of informed consent.

3.1 Equal Educational Opportunity

Moor’s discussion of equal educational opportunities is aimed at assessing the view that an instructor’s responsibilities in this regard would preclude assigning students to a control group. This discussion, however, would benefit from a distinction between opportunities and actual practices or achievements. Opportunities are mere possibilities for attaining some desired state. They provide favorable conditions for that attainment but give no guarantee that the desired outcome will occur in practice. Moor’s contention that an instructor may ethically offer different educational opportunities to different students is illustrated by an example of helping a student accidentally met in the hall while not helping other students in the same way. But this does not seem to be a case of offering different educational opportunities. Presumably, the instructor would provide assistance to any student who came seeking it. Moor, however, suggests that an instructor who helps one student in this way is not obligated to help other students in the same way. This suggestion seems to be false if the issue concerns educational opportunities as opposed to practices. Helping one student in the hall does obligate the instructor to help other students in that same way, if the occasion arises. It does not mean that the instructor must take action to ensure that each student is actually helped in that same way. In sum, the instructor is obligated to provide equal educational opportunities which, in practice, may be taken advantage of by some students but not by others.

Nor is it clear that using different textbooks in different sections of a course is a case of providing different educational opportunities. Moor uses this example to reinforce his view that providing different educational opportunities may be morally permissible. However, instructors often have a general concept of what is to be learned in a course while having several different means of instantiating that concept, and using different textbooks may be one of several different practices that offer the same educational opportunities.(13) Instructors who do so are not necessarily providing different educational opportunities. The student demand for equal treatment is best understood here as a demand for equally effective instruction. Providing equally effective instruction is one way of providing equal educational opportunities. Equally effective instruction may be provided, in practice, by different means and may in fact require that some students be treated differently than others. So, the focal point here is not equal treatment but rather equally effective instruction as a means of providing equal educational opportunities. And it does appear that instructors have a responsibility to provide this.

3.2 Optimum Instruction

Another ground for rejecting the appropriateness of subjecting students to different treatment conditions in the classroom concerns the responsibility for providing optimum instruction. Overall affirms this responsibility while Moor speaks merely of providing “adequate” instruction. Nevertheless, it is not clear that accepting this responsibility precludes controlled studies of the sort considered here. On the contrary, it may be that a commitment to optimum instruction requires rather than prohibits controlled evaluation studies in the classroom. This possibility is suggested by the view that what does or does not constitute optimum instruction is empirically determined. That is, a commitment to optimum instruction, unless hollow, is a commitment to gathering empirical evidence in the classroom. The question is whether that evidence must come in the form of controlled experiments. Overall’s resolution maintains that there exists an adequate alternative.

The value of the CAI programs could perhaps more easily and certainly more fairly have been assessed by providing them for both sections of the logic course, and then comparing their outcomes with those of students enrolled in the logic course in past years. There is no reason to suppose that this year’s crop of students is appreciably different from those of previous years so the instructor has adequate evidence to permit her to evaluate the differences in student achievement produced through the use of CAI programs.(14)

Overall believes that this sequential approach is more fair than an approach which assigns students to a control group. Her resolution, however, is not completely without problems. It is true that current students have not been deprived of what Overall terms “an apparent educational benefit.” But given the genuine possibility that the CAI treatment may be of no value or even counterproductive, this advantage is of limited significance. In fact, the current students have been subjected to the same risk endured by experimental subjects in a controlled study, and Moor’s resolution may be required should current students underperform previous students.
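
Overall’s before/after procedure is straightforward to state in computational terms. The following is a minimal sketch, not part of the original study: the score arrays, the use of SAT scores as a baseline aptitude proxy, and the two-sample t-test are all illustrative assumptions (the baseline check anticipates the comparability worry discussed below).

```python
# Sketch of a before/after cohort comparison with a baseline
# comparability check. All numbers are hypothetical illustrations;
# scipy is assumed to be available. This is not the original study's code.
from scipy import stats

# SAT scores (a baseline aptitude proxy) and final exam scores for a
# pre-CAI cohort and a post-CAI cohort (hypothetical values).
pre_cai_sat = [520, 580, 610, 490, 555, 600, 530, 575]
post_cai_sat = [515, 590, 605, 500, 560, 595, 540, 570]
pre_cai_final = [72, 78, 85, 65, 74, 81, 70, 77]
post_cai_final = [76, 84, 88, 70, 79, 86, 75, 82]

# 1. Document the similarity of the cohorts on the baseline measure;
#    a non-significant difference lends (weak) support to comparability.
t_base, p_base = stats.ttest_ind(pre_cai_sat, post_cai_sat)
print(f"Baseline (SAT): t = {t_base:.2f}, p = {p_base:.3f}")

# 2. Only then compare outcomes across semesters.
t_out, p_out = stats.ttest_ind(pre_cai_final, post_cai_final)
print(f"Final exam: t = {t_out:.2f}, p = {p_out:.3f}")

# Even with a baseline check, unmeasured semester-to-semester differences
# remain uncontrolled, which is why this design is more liable to error
# than a randomized control-group study.
```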

There is also some methodological difficulty with Overall’s resolution. Students can in fact differ from one semester to the next in respect to their ability to learn a particular subject. Consequently, a simple comparison of one semester’s students with those from past years can easily lead to a faulty conclusion concerning the efficacy of a pedagogical innovation. A dedicated instructor, however, might track student performance for several semesters and then do the same for several semesters following some innovation.(15) In addition, it would be helpful to develop a pretest or to establish a high correlation between some measure such as SAT scores and the ability to learn the subject being taught. These measures could document the similarity of the groups compared. While this would be an improvement, this approach, according to accepted canons of research methodology, would still be more liable to error than an experimental control group study.

This raises the question of how reliable the evidence must be when evaluating instructional techniques. In general, the need for reliable evidence increases with the extent to which well-established practices are modified. An innovation which radically changes the instructional process requires thorough evaluation. CAI programs vary in this respect. Some programs function merely as homework exercises and provide automatic grading while others alter the patterns and degrees of student-teacher interaction. Judgments concerning the reliability of the required evidence should be made by particular instructors considering particular programs and innovations. In any event, a commitment to optimum instruction does not rule out controlled evaluations, nor does it establish the acceptability of other alternatives.(16) To provide optimum instruction is to do the best we know how to for students. Characterized in this way, the epistemological burden is obvious, particularly since our understanding of effective instruction is continuously evolving. Determining what constitutes optimum instruction is best accomplished by empirical studies over long periods of time. The general nature of the research required bears on certain claims elaborated by Overall. Specifically, the student view that participating in controlled studies “will benefit not them but only some hypothetical future students” is worth comment here.

Educational improvements are best achieved by long term, sustained processes of change involving feedback mechanisms for the evaluation of innovative methods. These improvements are the products of long run commitments to what is basically a process of guided trial and error. One consequence is that, over time, some students are treated differently than others and some students receive better instruction than others. While this consequence does not in itself legitimize the different treatments required by controlled studies, it is germane to the dim view of “hypothetical, future students” taken in the students’ position. The quality of education enjoyed by current students has grown out of years of trial and error. Students cannot expect quality education without acknowledging the contributions of previous students (even though few of those students served as subjects in controlled experiments). Nor can today’s students justifiably demand optimum instruction without accepting their role in that process of trial and error. Students are not, by mere virtue of this role, obligated to serve as subjects in controlled experiments. But by the same token, the fact that only hypothetical future students may benefit from the experiment carries little or no weight in the assessment of that obligation.

Another component of the students’ position worth comment is that which is based upon their role as consumers of education. No student wants to be deprived of products and services available to others when all students pay the same fees. Nevertheless, the view of students as consumers of education is misleading when determining their rights in the context of controlled studies in the classroom. It is true that when selecting an institution of higher education, students, usually in conjunction with their parents, often think and behave as consumers. They think in terms of what will be received for what is paid. They make comparisons of campus living conditions, the surrounding environment, characteristics of students and faculty, ratings of various programs, etc. Nevertheless, they are not acting in a consumer mode when they “pay” for their education. An education is not something which can be bought. It is not a product which can be purchased (like a house or an automobile). Nor is it straightforwardly a service (like having one’s yard mowed or one’s car washed). Normally, if one pays for a product or service and gets nothing in return, one has good grounds for complaint. But it is commonplace in higher education for a student to pay tuition, learn nothing, receive a grade of ’F’ with no academic credit, and yet have no grounds for complaint. In paying tuition the student has not purchased credit hours or a grade in a course. What the student has paid for is an opportunity to be instructed, an opportunity that brings with it numerous responsibilities. To receive academic credit the student must perform assigned tasks, fulfill course requirements, and ultimately submit to evaluation by the instructor. Rarely, if ever, does the purchase of a product or service conform to this standard. In sum, the rights which students possess in the context of educational research do not follow from their role as consumers. That some students should not be favored in the classroom while others are disadvantaged follows, not from the fact that students have made equal payments, but from other principles concerning respect for individuals and the aims of the educational process. Fostering a respect for persons and, in turn, treating students as autonomous moral agents worthy of such respect are crucial elements of those aims. Were education to be provided free of charge, these rights would be guaranteed by virtue of those objectives.

These remarks indicate that claims about the aims of education can serve as a basis for making certain ethical judgments. An example of this is provided by Strike and Soltis (1985) in The Ethics of Teaching. In the following passage, these authors elaborate on what they see as an important educational objective:

  • In our view, growth as a moral agent, as someone who cares about others and is willing and able to accept responsibility for one’s self, is the compelling matter. Promoting this kind of development is what teachers ought to be fundamentally about, whatever else it is that they are about. We are first and foremost in the business of creating persons. It is our first duty to respect the dignity and value of our students and to help them to achieve their status as free, rational, and feeling moral agents.(17)

This goal is certainly laudable, but it is curious in one respect. There are no courses designed specifically to attain it. There is no course listed in the university catalog entitled “Achieving One’s Status as a Free, Rational, and Feeling Moral Agent.” If there is no course designed to fulfill such an important goal, how are students expected to achieve this status? The answer is that the goal is achieved by means of a variety of experiences which occur both inside and outside of the classroom. Student-teacher interaction is a crucial component of these experiences. Being treated with respect, as an end in oneself rather than as a means to something else, contributes to the development of a sense of worth and responsibility. Being treated as someone who is incapable of making rational decisions and who may legitimately be forced into a role extraneous to that of learner in the midst of one’s education is not conducive to the development of persons.

This line of reasoning makes explicit the link between claims about educational aims and particular ethical judgments. Nevertheless, it does not tell the whole story. Educational aims are diverse, and this diversity may lead to inconsistent implications for particular judgments. For example, education aims not only at producing persons in the sense intended by Strike and Soltis, but also at transmitting a cultural heritage and effectively imparting various elements of human knowledge. Emphasis upon this latter goal might lead to different judgments concerning the necessity for in-class experimentation and the appropriate roles of students and teachers in that context. One approach to avoiding such inconsistencies is to impose a hierarchy upon the diversity of educational aims. This is effectively what Strike and Soltis accomplish by their insistence that “we are first and foremost in the business of creating persons” (emphasis added). In any event, the point of these remarks is that statements of educational aims have a normative force and that they have logical implications for particular ethical judgments in education. In particular, the aim of creating persons is important and bears directly on the issues surrounding the use of controlled research in the classroom.

3.3 Informed Consent and Student Obligation

These points lead to the issue of informed consent, on which Moor and Overall disagree. Moor’s negative view of offering informed consent is based on two points. First, it is the instructor’s prerogative to select an instructional technique, and a student’s preference is not a deciding factor. Second, curricular and scheduling pressures subvert the process of obtaining freely given consent. Nevertheless, neither of these points seems strong enough to warrant abandoning informed consent. First, a student’s desire to avoid being subjected to a particular treatment condition may be not so much a desire to select a preferred instructional technique as a desire not to be subjected to “experimental” or unproved practices. This is certainly a different question than that of whether a student’s preference for some instructional technique should be decisive. Normally, the instructor’s reasoned judgment in selecting a suitable technique outweighs a student’s preference. This decision may be both complicated and difficult. One technique, or set of techniques, is often selected for the class as a whole, even though it may benefit some students more than others. Nevertheless, the selection is made on the basis of the welfare of current students, however conceived. But in the case of controlled experiments, students are not assigned treatments on this basis. The conflict between the welfare of current students and that of future students is evident in that the instructor is not choosing a technique for current students with their interests paramount. This is not a standard classroom situation, and a student’s request to avoid an experiment should not be settled in terms of whose preference is normally decisive.

Second, Moor is correct that curricular pressures may undermine informed consent, since a student may not be able to avoid a required course and/or enrolling in other sections of the course may be impossible. If the request for informed consent is genuine, however, these difficulties can be defused. One test of whether that request is genuine consists of how the instructor reacts when a student prefers not to take part in the experiment yet wishes to remain in the course and make use of the instructional methods normally available. Students whose preference is denied are not genuinely being asked for their consent. Accommodating this preference may require special provisions, but in most cases it is tantamount to assigning these students to the control group (and may require that their data be ignored). Of course, taking this action can jeopardize the reliability of the study’s findings, but instructors should be prepared to accept this consequence. There is good reason to believe, however, that this consequence will rarely occur. When students are given careful explanations of the purpose and significance of the experiment, their willingness to participate is almost universal.(18)

These views are consistent with the claim that instructors have a responsibility not merely to current students but to future students and to improving instruction as well. This claim provokes a related question. If instructors committed to providing optimum instruction have a responsibility to improve education, do students have an obligation to serve as subjects in controlled studies aimed at such improvements? An affirmative response to this question could be based on two points. First, students do have some obligation to repay the efforts and risks undertaken by previous students who helped to improve the quality of contemporary education. Second, this contribution would be in the form of taking a risk for the sake of future students. This risk involves the potential loss of grade and learning. Moor’s resolution, however, reduces this risk to zero. Consequently, a risk-free means of fulfilling an important obligation has been provided.

A negative response, however, could be based on a denial of each of these points. First, even if students do have an obligation to aid in the improvement of education, it is not clear that this contribution must be made by serving as a subject in a controlled experiment. Many improvements have resulted not from such experiments but rather from less formal, less accurately measured, and more slowly evolving forms of trial and error. Second, it is not clear that Moor’s resolution has in fact provided a risk-free means for students to contribute to improving education. Certainly, a student’s grade is protected, but when students do suffer a loss of learning an additional burden is placed upon them. In order to recoup that loss students must expend additional time and energy on the course. It should not be surprising if students are eager to be compensated for loss of grade but rarely inclined to expend the effort required to recoup loss of learning. In making the choice between receiving less learning or expending more effort, students are in a no-win situation. Whether students will face this choice is of course unknown at the outset of the experiment but its mere potential is enough to show that participating in the study is not a risk-free venture. Consequently, it does not appear that students have an obligation to participate in these studies.(19)

4.0 Minimizing the Risks to Students

There is no doubt that, even when students are treated as rational agents, there is a power and authority differential between those students and their instructors. With this in mind, instructors carrying out in-class research should do whatever is feasible to reduce the risks to students, even when those actions appear to have limited impact or are not absolutely required. For example, proposals for instructional research should be submitted to university “research with human subjects” committees for evaluation. Many universities exempt instructional research from evaluation by these committees. Federal policy, as stated in the Federal Register, supports this decision by virtue of the following category of exemption: “Research conducted in established or commonly accepted educational settings, involving normal educational practices, such as (i) research on regular and special education instructional strategies, or (ii) research on the effectiveness of or the comparison among instructional techniques, curricula, or classroom management methods.”(20) Nevertheless, university committees of this type often include student representatives and it is important for instructors to make use of this source of feedback. In addition, it is a good idea to post notices of the intended research in the schedule of classes indicating that research will take place in a particular course or section. Doing so will probably have little positive impact, but the potential for having some impact weighed against the near-zero cost makes this action reasonable.(21) Some universities even allow students enrolled in a course involving controlled research to retake the course free of charge. These examples illustrate what should be the researcher’s guiding principle: reduce the risks to students whenever doing so is feasible even if the anticipated benefits are minor.

One question raised by this discussion is whether it is possible to evaluate instructional techniques without any risks at all to students. This possibility is suggested by the following scenario. An instructor teams up with a psychologist who has access to a large pool of students for use as experimental subjects. This often occurs, for example, in large introductory psychology classes. These subjects are randomly assigned to groups which are taught a given subject matter by different techniques. Comparisons of the results are then used for purposes of evaluating the different pedagogical techniques. Since these students have not signed up for a course in the subject matter being taught, the rights and responsibilities of those involved are substantially altered, and it appears that many of the ethical difficulties associated with instructional evaluation can be avoided. This is not to say, however, that ethical questions concerning the appropriateness of requiring introductory psychology students to serve as research subjects disappear. Nor is it to say that this scenario is practical in many situations. But this line of thinking is representative of the attitude that researchers should adopt when stepping into the classroom to evaluate instructional techniques, and it is indicative of the concern for student interests which should be interwoven into their research efforts. Creative thinking about ways of reducing the risks to students should be both encouraged and applauded. Such thinking will eventually produce results.

5.0 Summation

Overall and Moor have each raised relevant questions and made valuable suggestions concerning the ethical problems involved in evaluating educational innovations. The conclusions drawn here, while differing from their suggestions at various points, have the benefit of their analyses. Perhaps the primary conclusion to be drawn is that controlled evaluations of educational innovations are not ruled out by appeals to student rights or the instructor’s responsibility for providing optimum instruction. The choice between controlled studies and the pre/post comparisons suggested by Overall should be made according to the reliability of the evidence required. This in turn should be determined by the extent of educational change introduced. Moor is correct in asserting that there are some cases in which controlled evaluations are appropriate. When these evaluations are carried out, however, students should be asked for their consent. Overall’s insistence on the value of informed consent is justified, and it is important that the request for informed consent be genuine. Students not wishing to participate in such studies should be allowed to remain in the course without jeopardizing either their grade or learning. In addition, Moor’s proposal to compensate for any loss of grade or learning should be adopted. Students are not obligated to serve as subjects in these controlled evaluations, but this claim is best founded upon the aims of education rather than consumer rights or the fact that current students will not directly benefit. Ways of reducing the risks to students involved in instructional research should be actively pursued and implemented wherever feasible even if that reduction is slight.

Professional educators have a responsibility to ensure that their techniques of instruction are effective and to continually strive to improve their efforts. Neither of these responsibilities necessitates the execution of controlled experiments or the introduction of revolutionary innovations. Most educators fulfill their responsibilities in these respects through more mundane yet admirable efforts. However, more radical endeavors are not ethically precluded and can be implemented and evaluated in ways that respect the concerns of both teachers and students and the purposes for which they have come together. Computers certainly have the potential for radically transforming the educational process. In whatever forms that potential is actualized, the empirical consequences and ethical value of that transformation should be carefully assessed.(22)

The University of North Carolina at Charlotte

End Notes

  1. Leslie Burkholder, “Computer Ethics on Campus,” in Terrell Ward Bynum, Walter Maner and John L. Fodor, eds., Computer Ethics Issues in Academic Computing, Research Center on Computing & Society, 1992. (This volume, pages 1 – 12 above.)
  2. This is case number 5 in the Ethical Case Studies in Teaching Philosophy series edited by Philip Pecorino. It is published in full in Teaching Philosophy, Volume 9, 1986, p. 351.
  3. James H. Moor, “Computer-Assisted Instruction and the Guinea Pig Dilemma,” Teaching Philosophy, Volume 9, 1986, p. 354.
  4. Ibid., p. 353.
  5. Ibid.
  6. Ibid., p. 354.
  7. Ibid.
  8. Christine Overall, “Innovation and Injustice,” Teaching Philosophy, Volume 9, 1986, p. 356.
  9. Ibid., p. 357.
  10. Ibid., pp. 357 – 358.
  11. Ibid., p. 355.
  12. Moor, p. 354.
  13. Overall touches on this issue when she asks whether students divided into experimental and control groups are being provided with the “same educational experience.”
  14. Overall, p. 357.
  15. There is a potential problem with comparing student performance over several semesters before and after the introduction of some innovation, however. Many additional changes may occur over extended periods and may thus confound any attempt to attribute observed improvements to the instructional innovation.
  16. One alternative to the standard experimental control group study is provided by a “within subjects” design. Students in this arrangement serve as both experimental and control subjects. That is, each student is exposed to both the experimental and control treatment condition. In respect to evaluating a CAI program, for instance, half of the students might use the program for several weeks while the other half make use of standard methods. Then, the students using the CAI program would make use of standard techniques for the same period of time and vice versa. Performance measures would be taken at several points during this process for comparative purposes, and several such alternations might even be scheduled. (One practical drawback is that scheduling more than one swapping of treatments may be disruptive from both a curricular and pedagogical perspective.) The advantage of this approach is that, except for order of conditions, students are given equal treatment. The exception just noted is indicative of some ethical difficulty. Subject matter and its level of difficulty vary throughout the course. The CAI program might be most effective when the subject matter is most difficult, and students using CAI during that period would have an advantage. Consequently, Moor’s resolution should be employed as necessary.
  17. K. Strike and J. Soltis, The Ethics of Teaching, Teachers College Press, 1985, p. 63.
  18. Glaskow, Sadowski, and Davis (1977) found that even in experiments involving deception and stress, student ratings of their unacceptability decreased when information was provided about the worth of these studies. In one of my own studies, after explanation of its purpose, students randomly assigned to experimental and control groups were given the opportunity of changing groups, but none chose to do so (Croy, 1991).
  19. Two points are in order here. First, Moor himself does not claim that his resolution makes in-class experimentation a risk-free venture for students. Second, I believe that, despite the complexities just noted, Moor’s resolution makes a significant contribution to the fair treatment of students in this context. It is not, however, without an additional complication. This complication is related to the concept of aptitude-treatment interaction. ATI research aims at discovering whether different students possess different characteristics that may predispose them toward or against various instructional techniques. That is, a given student may possess an aptitude that would make computer-assisted instruction more effective for that individual than some other means of instruction, or vice versa. Students with different aptitudes may well populate both experimental and control groups of an evaluative study and consequently some students may be disadvantaged more than others by assignment to a group which underperforms its counterpart. In fact, some students may not be disadvantaged at all. So, not every student will deserve to be compensated and those who do may deserve to be compensated by degree rather than by a fixed amount. Nevertheless, without prior knowledge of this interaction, little or nothing can be done to correct for it. A study which aims at evaluating an instructional technique will probably precede analyses of aptitude-treatment interactions with that technique. Consequently, Moor’s resolution should be adopted as formulated unless evidence dictates otherwise.
  20. See the final common rule expressed in the U.S. Government’s Federal Register, Vol. 56, No. 117, June 18, 1991.
  21. Part of the cost, of course, will come in terms of subject self-selection. Self-selection, however, is already present to a larger degree and should be increased little by providing advance notice concerning in-class research.
  22. I would like to thank Jim Moor for his critical comments on an earlier version of this paper.

References

L. Burkholder, (1992) “Computer Ethics on Campus,” in Terrell Ward Bynum, Walter Maner and John L. Fodor, eds., Computer Ethics Issues in Academic Computing, Research Center on Computing & Society. (This volume, pp. 1 – 12 above.)

M. Croy, (1991). “Integrating CAI Development and Empirical Research: Opportunities and Responsibilities,” Computerized Logic Teaching Bulletin, Volume 4, pp. 2 – 12.

D. Glaskow, C. Sadowski and S. Davis, (1977) “The Project Must Count: Fostering Positive Attitudes Toward the Conduct of Research,” Bulletin of the Psychonomic Society, Volume 10, pp. 471 – 474.

J. Moor, (1986) “Computer-Assisted Instruction and the Guinea Pig Dilemma,” Teaching Philosophy, Volume 9, pp. 351 – 354.

C. Overall, (1986) “Innovation and Injustice,” Teaching Philosophy, Volume 9, pp. 354 – 358.

K. Strike and J. Soltis, (1985) The Ethics of Teaching, Teachers College Press.

U.S. Government, (1991) Federal Register, Vol. 56, No. 117, June 18.

Some Effects of Computer Technology on Human Interaction and Individualization in the Teaching of Deductive Logic

Marvin J. Croy
Michael G. Green
James R. Cook

Terry Bynum opened the National Conference on Computing and Values with the question: “Shouldn’t computers serve to enhance and protect human values rather than to threaten them?” One feels compelled to answer this question in the affirmative. To do so, however, assumes that human values can be made explicit and that what counts as serving a particular value can be determined. In the following pages a description is given of a concerted effort to do just this. The effort takes place within an educational context in which computers are used for instructional purposes. There are two important characteristics of this attempt to assess the ways in which this computer use might affect human values. First, the assessment has both empirical and normative components. It endeavors to determine the empirical consequences of using computers in a certain way and to explicate the ethical value of those consequences. Second, this study aims at achieving an important goal in respect to the management of technology, namely, that of predicting and evaluating the consequences of introducing an innovation prior to actually implementing that technological change. At present the attempt to achieve this goal is only at the halfway point. We stand at the end of the first year of a two year study, and consequently the discussion here will focus not upon the particular empirical consequences emerging, but upon the techniques and rationale underlying the determination and evaluation of those consequences.

1.0 Background

The educational computer technology at issue here consists of computer-assisted instruction programs for teaching deductive proof construction at The University of North Carolina at Charlotte. The development of these programs and indeed the development of much CAI in general has been stimulated by the promise of increased individualization. The promise has been to deal more sensitively and effectively with aptitude differences among students. These differences have been one of the chief difficulties faced in teaching deductive logic. Initially, two CAI programs were designed to provide the standard benefits of unlimited practice, self-pacing, and immediate feedback, but it was quickly seen that this was far from being sufficient. Student aptitudes and weaknesses varied so greatly that more needed to be done to provide genuine individualization. Much more needed to be known about the particular difficulties students were experiencing, and those difficulties needed to be categorized in ways that suggested helpful remedies. Consequently, the CAI programs were modified to function as data collection devices, thereby providing windows on student trouble spots. As students used the programs, records were kept of their rule application success rates, the particular errors made, and the steps taken in solving proof problems.

There have been three main effects of this data collection. First, by periodically generating class performance summaries, in-class instruction can be geared to current difficulties on a real-time basis. It turns out that different classes have somewhat distinctive characteristics and the CAI programs help in tracking and responding to widespread difficulties which may change as the semester progresses. Second, at the end of each semester, statistical analyses and pattern matches are used to identify more specific patterns of errors (“pseudo-rules”). The CAI programs have been modified to be sensitive to these patterns and tailor-made error handling routines have been built in.
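
As a rough illustration of the kind of record keeping and class-summary generation described above, consider the sketch below. The field names, rule names, and error labels are hypothetical; the actual CAI programs’ data formats are not documented here.

```python
# Minimal sketch of per-attempt record keeping and class-level error
# summaries of the kind described above. Field names, rule names, and
# error labels are hypothetical illustrations.
from collections import Counter
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class RuleAttempt:
    student_id: str
    rule: str                          # e.g., "modus ponens"
    correct: bool
    error_label: Optional[str] = None  # e.g., a suspected "pseudo-rule"

def class_error_summary(attempts: List[RuleAttempt]) -> Counter:
    """Tally error labels across all students so that in-class
    instruction can be geared to currently widespread difficulties."""
    return Counter(a.error_label for a in attempts
                   if not a.correct and a.error_label)

attempts = [
    RuleAttempt("s1", "modus ponens", True),
    RuleAttempt("s2", "modus ponens", False, "affirming the consequent"),
    RuleAttempt("s3", "modus tollens", False, "denying the antecedent"),
    RuleAttempt("s4", "modus ponens", False, "affirming the consequent"),
]
for label, count in class_error_summary(attempts).most_common():
    print(f"{label}: {count}")
```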

The identification of pseudo-rules and other types of misconceptions, however, did not fully address the problem of individualization. There was a wide variety of ways in which students failed at proof construction, and there was no guarantee that any given misconception, however prominent in the data at large, would be troublesome for a given student. In addition, there was still much to be learned about student strengths and weaknesses. Consequently, as a third effect of the data collection, special out-of-class meetings between student and instructor were initiated. These sessions have developed into an important part of the course. During these sessions, student and instructor sit together at the computer while consulting performance records and reviewing particular errors and problem solving efforts. What can result is the identification of idiosyncratic difficulties which are not widely shared among students. Occasionally, misconceptions are discovered which are not only unknown to the CAI programs but which have never before been encountered by the instructor. Overall then, the CAI programs provide both direct and indirect support for individualized instruction based on empirical inquiry into actual student difficulties. The data collecting features of the programs serve to (1) guide in-class instruction, (2) provide the empirical basis for investigating student errors, and (3) facilitate productive meetings between student and instructor.

The use of these student-teacher meetings in the effort to promote individualized instruction is of chief interest here. A number of questions concerning the impact of these meetings have arisen. While the impact upon student learning is certainly relevant, other questions appear to be of comparable or greater significance. These questions concern a wider effect on student behavior and attitudes. For example, introduction of the special student-teacher meetings appeared to coincide with some unanticipated yet beneficial changes in student behavior during class. Some students became more active in class, more inclined to ask questions and request clarifications. This is encouraging given the subject matter in deductive logic. There are few subjects in which one’s inability may be so clearly exposed to the class at large, and this prospect may serve to suppress classroom inquiries and responses.

While these events were promising, there was no clear indication that they were occurring consistently or that they were systematically related to the special out of class meetings. There was in fact no evidence that the special sessions achieved the goal of increased individualization or, even if achieved, that they did so in a more efficient manner than some other approach might. Given the number of hours required for the special meetings, the problem of efficiency seemed significant, but there was another issue that cast a critical shadow over these developments. The design of much current CAI runs counter to the direction in which this project has progressed, and this point bears elaboration in the context of “intelligent” programs and the developing impact of computers on education.

2.0 Intelligent CAI and Individualization

The great promise of computers for individualization in education is often dampened by the impact that this technology may have upon human-centered processes of instruction. In particular, the prospect of replacing teachers by computers looms on the horizon. It is difficult to imagine a use of computers as instructional devices that does not in some way take over some task previously performed by human teachers. With the possible exception of distance education and other programs which provide entire courses, however, relatively few applications are designed to completely replace human teachers. Replacement of teachers by computers can thus be thought of as a continuum with distance education at one end and perhaps supplementary drill and practice lessons at the other. If so, then any instructional computer program can be characterized as replacing human teachers to some degree. With this point understood, it is probably best, when contemplating the impact of computers on the educational process, to shift the focus to particular activities of teachers and to the roles which those activities place teachers in. This impact may be magnified as CAI programs become more sophisticated. Application of artificial intelligence research, for example, aims at producing “intelligent” computer-assisted instruction (ICAI). When describing the introduction of an intelligent geometry tutoring system into a high school setting, Chaiklin and Lewis (1988) concluded that “both the role of the teacher and the classroom structure changed significantly when ICAI went into the classroom.”(2) In this application students spent most of their learning time with the tutoring system and met with their instructor only during office hours. During these meetings, the teacher played the role of “learning consultant.” Sleeman and Brown (1982) envision intelligent CAI progressing to the point in which programs take up the primary instructional tasks and are only in occasional need of human interaction as a “backup.”(3) Much of the development of CAI appears to be moving in a related, if less extreme, direction. Computer programs are often designed to be increasingly sophisticated so as to take responsibility for tasks previously performed by teachers. One consequence is that the role of the teacher is changing. Moreover, this change is making computer programs rather than human teachers responsible for tailoring instruction to individual needs.

The general question raised by these developments is whether teachers will play a primary or secondary role relative to computer programs in the instructional process. But a more particular question has been raised in respect to our current project. Perhaps the possibility of replacing the special student-teacher meetings with a more sophisticated computer program should be explored. Could whatever is achieved during those sessions be attained by having students interact with a computer program only? Exactly what are the benefits which result from the special sessions? In order to answer these questions, a two-year research project was launched. The project’s aims were not only to document the impact of the current practice regarding special student-teacher meetings but also to determine the consequences of replacing those meetings with a more capable computer program. What differences would there be, for example, between receiving feedback from the computer as opposed to the teacher? The plan for answering these questions is divided into two stages. The first stage is to establish a set of measures and procedures which will reliably exhibit the consequences of the innovation being considered. This has been accomplished during the first year of the project, a year primarily committed to “pilot testing.” The second stage, to be carried out in year two of the project, is to use those measures and techniques in an intensive empirical study.

3.0 First Year Objectives of the Study

1. The primary objective of the first year of this study was to develop a set of measures and techniques which would be sensitive to differences resulting from students receiving human as opposed to computer-supplied feedback. In particular, these measures were to include:

  • a questionnaire for assessing student attitudes toward the course, instructor, computers, subject matter, and fellow classmates;
  • variables which identify relevant in-class behaviors; and
  • performance measures which would assess any impact on student learning.

2. Two sections of PHI – 2105 (Deductive Logic) were to be involved in a within-subjects experimental design in the process of developing these measures.

During the Fall semester of 1990, twenty-six students enrolled in PHI – 2105 signed informed consent statements and were divided into two groups. These groups were formed as a result of matching subjects on the basis of previous computer experience. Students provided information about their prior computer experience as part of a logical ability pre-test. Early in the course, students also filled out a questionnaire (of approximately 35 items) which was subsequently repeated three times during the remainder of the course. This initial phase of the course is referred to as time frame 1 as shown in figure 1. (Figures are included below at the end.) During this period students were assigned seats for the duration of the course in order to facilitate documentation of in-class behavior. The procedure of videotaping classes was also introduced. Most classes followed a lecture/demonstration format as the initial concepts and procedures of symbolic deductive logic were established.
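
The matching procedure described above can be sketched as follows. This is an illustration only, assuming a simple numeric scale of prior computer experience; the study’s actual matching procedure may have differed in detail.

```python
# Sketch of matched-group assignment on prior computer experience:
# sort students by experience, pair adjacent students, and randomly
# assign one member of each pair to group A and the other to group B.
# The experience scores below are hypothetical.
import random

def matched_assignment(experience: dict, seed: int = 0):
    rng = random.Random(seed)
    ordered = sorted(experience, key=experience.get)
    group_a, group_b = [], []
    for i in range(0, len(ordered) - 1, 2):
        pair = [ordered[i], ordered[i + 1]]
        rng.shuffle(pair)
        group_a.append(pair[0])
        group_b.append(pair[1])
    if len(ordered) % 2:  # odd student out, assigned at random
        (group_a if rng.random() < 0.5 else group_b).append(ordered[-1])
    return group_a, group_b

scores = {"s1": 3, "s2": 7, "s3": 1, "s4": 5, "s5": 6, "s6": 2}
a, b = matched_assignment(scores)
print("Group A:", a, "Group B:", b)
```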

This period was followed by time frame 2 which marked the first crucial stage of the study. Students took up the task of learning deductive proof construction and began using two CAI programs designed to facilitate that effort. At this point the class was divided into the two groups which were matched in terms of previous computer experience. Students designated as members of group A attended two special out of class meetings (approximately 20 – 25 minutes in length) with their instructor. During these sessions, the instructor made a diagnosis of current difficulties and a recommendation for addressing these problems. This diagnosis was recorded on a form designed for this purpose. Group B students meanwhile were receiving their diagnostic feedback directly from the computer. The same form filled out during the group A teacher-student meetings was available for group B students as a screen display in one of the CAI programs. Although this process was characterized for students as “receiving feedback from the computer,” the diagnostic feedback was actually entered on a case-by-case basis for each student by the instructor. The instructor generated this feedback by examining records of student performance in exactly the same way as was done during the teacher-student meetings. Consequently, the content of the feedback was very similar, if not identical, for particular observed difficulties whether the student having those difficulties was in group A or in group B.

Classroom instruction during this period was centered around problem-solving activities. For example, a proof problem would be written on the board and students would be given several minutes to individually search for a solution. Then students would describe alternative solutions as the instructor asked questions about the appropriateness of the proposed solutions and particularly about the strategies used in discovering those solutions. In addition, some students usually asked their own questions about the proofs described. This pattern of in-class activity prevailed for much of time frame 2.

At the end of time frame 2 all students faced an hour-long in-class exam emphasizing deductive proof construction. Subsequent to that exam, students filled out a second questionnaire which was an adaptation of that previously given. Two forms of this questionnaire were administered. One form contained elements which asked students in group A specifically about their special meetings with the instructor while the second did the same for group B students concerning their computer-supplied diagnosis. Both forms of this questionnaire asked students about their attitudes toward the instructor, their classmates, the course, computers, and about their current performance.

During the next phase of the course (time frame 3), the activities of students in each group were switched. Group A students now received their diagnostic feedback from the computer while group B students began meeting individually with the instructor. The subject matter remained deductive proof construction (although the complexity and level of difficulty increased) and in-class procedures continued the problem solving patterns of time frame 2. Records of student attendance and in-class responsiveness were kept as in time frame 2. At the end of this period, students once again faced an hour-long in-class exam and filled out a questionnaire relevant to their group’s activities.

In the final phase of the course (time frame 4) the subject matter shifted toward the application of deductive logic to natural language argument reconstruction. No CAI programs were used during this period, nor were any special student-teacher meetings held. Although this was not a crucial period for research purposes, the monitoring of student attendance and responsiveness was continued. (Classroom procedures were similar to those of time frames 2 and 3.) At the end of the course, students once again completed a one-hour exam and a final questionnaire. The questionnaire was adapted toward explicit comparisons of the treatments earlier experienced by students.

During the Spring 1991 semester, 29 students enrolled in PHI – 2105 were also studied using the within-subjects design. The general structure of the course and pattern of activities were identical to those of the Fall semester. The informed consent statement and relevant portions of the exams also remained identical. However, the questionnaires were refined on the basis of previous findings.

4.0 First Year Results

Prior to presenting actual results, it should be reemphasized that an important component of this study focuses not upon how human versus computer-supplied feedback affects student learning, but rather upon how this difference affects human interaction and student attitudes. In particular, student activity in the classroom and interaction with the instructor are of interest. This interest grows out of previous experience with the special student-teacher meetings. The introduction of these meetings emerged from the long term development and use of supplementary CAI programs. These programs evolved in a form which provided students with rudimentary information concerning their individual strengths and weaknesses. Eventually, special student-teacher meetings were built around examinations of performance records assembled during the use of the CAI programs. It was at this point that casual observations seemed to indicate that students became more active and responsive in class following these meetings. There seemed to be something about the human contact itself that produced changes in student in-class behavior, and these changes seemed to carry over in lesser degrees to subsequent parts of the course. This supposition, however, was untested prior to our current research efforts.

4.1 Predictions and Actual Outcomes

The research hypothesis put forward was that the difference between human-supplied and computer-supplied feedback would be associated with changes in student behavior and attitudes. Specifically, it was expected that human-supplied as opposed to computer-supplied feedback would do more to promote learning, positive in-class activity, and favorable student attitudes toward the course, their classmates, and their instructor. Given the previous experience just described, the general pattern of results expected is shown in Figure 2. (In this figure, each time frame is collapsed into a single point representing the results obtained for that entire time frame.) Group A receives human-supplied feedback in time frame 2 and computer-supplied feedback in time frame 3; this sequence is reversed for group B. Neither condition was administered to either group during time frame 4. Notice in Figure 2 that group A’s mean is higher than group B’s during time frame 2 but that the reverse is true for time frame 3. This outcome would support the hypothesis that human-supplied feedback results in higher scores on the measure taken. In any event, the essential feature is the “crossing pattern” of the lines generated by the particular measure, and it is this pattern that will be searched for. Other patterns of results may also support the hypothesis being explored, particularly given that carry-over effects from one time frame to another may occur. Nevertheless, the crossing pattern between time frames 2 and 3 would provide the clearest indication of support for the hypothesis and of the sensitivity of the measures being used.
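To make the predicted pattern concrete, the following sketch (written in Python for illustration; the group means are invented placeholders, not data from this study) shows one way the crossing pattern between time frames 2 and 3 could be checked for any given measure:

    def crossing_pattern(mean_a_tf2, mean_b_tf2, mean_a_tf3, mean_b_tf3):
        # True when group A leads in time frame 2 and group B leads in
        # time frame 3, the pattern predicted by the hypothesis.
        return mean_a_tf2 > mean_b_tf2 and mean_b_tf3 > mean_a_tf3

    # Hypothetical group means on some measure (e.g., in-class responses).
    print(crossing_pattern(4.2, 3.1, 2.8, 4.5))  # True: crossing present

The same test could in principle be applied to each behavioral, performance, or attitudinal measure reported below.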

4.1.1 Behavioral Measures

One hypothesis put forward on the basis of previous experience, then, is that students receiving human-supplied feedback would be more active in the classroom than students receiving computer-supplied feedback. “More active” has been defined here in terms of three behaviors: (1) raising questions during class; (2) answering questions posed by the instructor during class; and (3) consulting the instructor immediately after class concerning some aspect of the subject matter. A record of the frequency of each of these behaviors for each student was maintained throughout the semester. The results of these measures for each semester are shown in Figure 3. The time frames of most interest in each chart are time frames 2 and 3. Recall that in time frame 2, group A students received their diagnoses from the instructor while group B students received theirs from the computer; during time frame 3, the reverse was true. Each of these charts demonstrates that the mean number of in-class responses observed during time frame 2 is higher for group A than for group B. That is, students receiving human-supplied diagnoses were more active in class. The same is true in time frame 3, when group B students began receiving their feedback directly from the instructor. The greatest difference observed between groups occurs during time frame 3, and the overall pattern of differences is similar across both semesters.

Thus far nothing has been said about the extent to which students attended or failed to attend class during these time frames. This is important since attendance rate would affect the total number of in-class responses observed for particular students and their groups. One way to separate this factor out is to divide the frequency of each student’s in-class responses by the number of classes attended, producing a measure of in-class responses per class attended. The resulting group means are shown in Figure 4 for the various time frames. (Because of late student drop-add activity, a seating chart was not established until time frame 2 for the Spring semester; consequently no observations are given for time frame 1 of that semester.) Again, the expected “crossing pattern” appears between time frames 2 and 3, and time frame 3 continues to produce the largest differences observed. Class attendance rates themselves are provided for each semester in Figure 5. The expected crossing pattern appears in the Spring data but not in that for the Fall.
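As an illustration of this rate measure, the following sketch (in Python, using invented student records; the field names are assumptions, not the study’s actual record format) computes each student’s responses per class attended and then averages within groups:

    # Hypothetical student records for one time frame.
    students = [
        {"group": "A", "responses": 12, "attended": 10},
        {"group": "A", "responses": 5, "attended": 8},
        {"group": "B", "responses": 7, "attended": 11},
        {"group": "B", "responses": 3, "attended": 9},
    ]

    def group_mean_rate(records, group):
        # Mean of per-student rates: in-class responses per class attended.
        rates = [r["responses"] / r["attended"]
                 for r in records
                 if r["group"] == group and r["attended"] > 0]
        return sum(rates) / len(rates)

    for g in ("A", "B"):
        print(g, round(group_mean_rate(students, g), 2))

Dividing by attendance in this way prevents a student who simply attends more classes from inflating the group total.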

4.1.2 Performance Measures

As stated earlier, the effect of the treatment conditions on student performance in the course is not the central focus of this research, but Figure 6 provides a comparison of the groups with respect to exam scores during each semester. The expected crossing pattern appears between time frames 2 and 3 in the Spring but not in the Fall. Exam performance can be broken down into two component tasks corresponding to the two CAI programs used by students and the types of diagnoses provided: each exam contained a section in which students assessed given justifications for proposed inferences and a section in which complete proofs were constructed. The results for the justification task are shown in Figure 7 for both semesters. Here, the predicted crossing pattern appears in the Fall but not in the Spring. With respect to the task of constructing proofs (Figure 8), the crossing pattern appears in the Spring but not in the Fall.

4.1.3 Questionnaire Results

The data obtained by questionnaire are the most complex and challenging with respect to systematic analysis. Questionnaires were administered at four different points in the course, marked as Q1 through Q4 in Figure 1. Each questionnaire contained approximately 30 – 40 items, depending upon the time frame and semester in which it was given. Being explicit about what should be expected here is more difficult than for the other measures, since previous experience with student attitudes and their expression during the course is limited. The hypothesis previously put forward, that having special student-teacher sessions would result in greater in-class responsiveness, is, however, suggestive. It suggests that the special sessions would result in more favorable attitudes concerning various aspects of the course and students’ experiences in it. As will be seen below, analyses of the questionnaire data provide only partial confirmation of this expectation.

The analysis of questionnaire data proceeded as follows. The intercorrelations between items on the Fall questionnaires were examined in an effort to identify scales of similar content or meaning. Four scales were initially developed. These scales assess student attitudes towards the instructor, other students in the class, computers, and the course. Another scale was constructed to gauge the students’ estimate of the helpfulness of the special sessions with the instructor and the summary feedback from the computer. A final scale assesses student perceptions of the degree to which they interacted with their classmates both within and outside of class. Each of these scales is composed of two to eight Likert-type items (ranging in numeric value from one to five).
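As a rough illustration of this scale-building procedure, the following sketch (in Python, with invented Likert responses rather than the study’s data) computes an inter-item correlation, which would inform whether two items belong on the same scale, and then a per-student scale score as the mean of the scale’s items:

    import statistics

    def pearson(x, y):
        # Pearson correlation between two equal-length lists of responses.
        mx, my = statistics.mean(x), statistics.mean(y)
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = sum((a - mx) ** 2 for a in x) ** 0.5
        sy = sum((b - my) ** 2 for b in y) ** 0.5
        return cov / (sx * sy)

    # Invented Likert responses (1 to 5): one list per item, one value per student.
    item1 = [4, 5, 3, 4, 2]
    item2 = [5, 5, 3, 4, 1]
    print(round(pearson(item1, item2), 2))  # a high r suggests the items scale together

    # A student's scale score is the mean of the items composing that scale.
    scale_scores = [statistics.mean(vals) for vals in zip(item1, item2)]
    print(scale_scores)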

Student attitudes toward the instructor were assessed by the following items:

1. The course instructor cares whether or not I learn the material.
2. I am receiving as much individual attention as I need in this course.
3. The instructor deals with me reasonably and fairly.
4. I have had adequate opportunities to talk with my instructor.

Figure 9 presents the results for each time frame and each semester with respect to favorable attitudes toward the instructor. No crossing pattern is exhibited in the Fall data between time frames 2 and 3, although there is a slight tendency toward convergence. However, the expected crossing pattern does appear in the Spring.

Attitudes towards fellow students were assessed through items which asked whether “other students in this class seem: friendly, supportive, intelligent, helpful and considerate” respectively. The scale indicating a favorable attitude towards other students in the class is charted for both semesters in Figure 10. Nothing in the Fall data supports the hypothesis that having human as opposed to computer-supplied feedback affects this attitude, but the Spring results do show the expected crossing pattern.

Attitudes towards computers were assessed through the following items (except in the Fall semester when only the first 3 items were used):

  1. I like working with computers.
  2. I prefer working with a computer to working together with another person.
  3. My experiences with computers have been enjoyable.
  4. I look forward to working with the computers in this class (used in Spring only).

Figure 11 presents the results from the scale assessing favorable attitudes toward computers. While a convergence appears between time frames 2 and 3 during the Fall, there is no indication in the Spring data of any changes that might be related to the treatment conditions administered.

Students’ attitudes towards the course were assessed by the following items:

  1. I feel like I understand the material in this course.
  2. This course meets my particular needs in learning logic.
  3. My performance in this course is helping me feel good about myself.
  4. I enjoy working with abstract concepts.

Figure 12 exhibits the results for both semesters concerning student attitudes toward the course. These results are of interest because they suggest an effect opposite to that predicted. In the Fall, for example, group A attitudes are more favorable than those of group B in time frame 2, as expected, but these differences increase rather than decrease in time frame 3. In the Spring, group B students express more favorable attitudes during time frame 2, which is clearly the opposite of what was expected.

Students’ assessment of the helpfulness of the special sessions and the computer feedback was derived from the following items, each completing the stem “My individual meetings with the instructor (or summary diagnostic report)”:

  1. Filled an important need for me in this course.
  2. Were worth the time spent.
  3. Helped me to identify and correct my weaknesses in applying the logical rules.
  4. Helped me to feel better about my chances of doing well in this course.
  5. Were probably more helpful than getting recommendations generated by the computer.
  6. Provided me with a clear indication of what I need to know to do better in the class.
  7. Were a convenient way to get feedback.
  8. Provided me with as much information as I needed.

Using these same items, the fourth questionnaire was modified so that students made a direct comparison, based on their experience, of whether the special meetings or the report from the computer was more helpful. The results show that students’ perceptions about the relative helpfulness of these alternatives varied from the Fall semester to the Spring. During the Fall, students did not perceive the meetings with the instructor as more helpful than the computer feedback; instead, one group of students showed a slight tendency toward more positive evaluations at each time period. In the Spring, though, there were large differences between groups, with the special sessions being viewed as more helpful.

During the Spring semester only, the following items were used to assess the degree to which students were interacting with their classmates. The first two items were combined to measure how often students interacted during or near class time, while the last two items were combined to assess their interactions away from class.

  1. During the past two weeks, I have talked with fellow classmates before, during or after class about class material.
  2. During the past two weeks, I have talked with fellow classmates before, during or after class about other matters.
  3. During the past two weeks, I have talked with fellow classmates about class material away from class.
  4. During the past two weeks, I have talked with fellow classmates about other matters away from class.

Students having special meetings with the instructor (Group A) consistently indicated that they interacted with their classmates more than did Group B. There was also a general upward trend for each group, with more interaction occurring later in the semester. This would, of course, be expected in most classes, as students become familiar with fellow classmates and more likely to interact with them.

5.0 Assessing the Ethical Implications

Before discussing the implications of these findings, it should be reemphasized that they are both partial and preliminary. Indeed, until their statistical significance is attended to, as subsequent analyses will do, the term ‘findings’ is premature. Nevertheless, these data do suggest a number of possible outcomes which can provide a basis for ethical reflection in this context. For example, if having special meetings with the instructor turns out to be associated with increased in-class student responsiveness, questions about the worth of that responsiveness arise. In what follows, this question and others of a similar nature will be explored, primarily for the purpose of sketching a framework for assessing the ethical value of such possible results. In addition, more will be said about the ways in which this study serves to predict the consequences of a technological innovation prior to its implementation.

At the outset of this report, the present study was characterized as a dual process of documenting empirical consequences and explicating the ethical value of those consequences. Thus far the focus has been upon the empirical component, yet questions concerning the normative component are of equal importance. How exactly is the value of an empirical consequence to be explicated? What determines whether a given result is or is not of value? One approach to be explored here stems from John Dewey’s view that “the specific values discussed in educational theories coincide with aims which are usually urged.”4 This suggests that the value of some outcome might be assessed by means of its relationship (or lack thereof) to educational aims. That is, a given outcome may be shown to be an instantiation of some value which is implicit in the expression of an educational aim.

If statements of educational aims are to provide a basis for determining the value of particular pedagogical outcomes, a number of issues must be addressed. First, educational aims are wide-ranging. They vary from one institution to another and may at times seem inconsistent, as in the attempt to teach both cooperation and a competitive spirit. Statements of educational aims also differ on a scale that ranges from more general to more specific, and controversy may arise over what specific actions or policies serve as a means to some generally-stated end. The diversity in educational aims is also evident in social expectations. Students are expected to learn much more than the sum total of the course content in their educational careers. They are expected to acquire a sensitivity to individual differences and a respect for the rights of others, to learn to communicate, cooperate, and compete in positive ways, to develop a healthy self-concept, etc., even though there are rarely any particular courses in these “subjects.” Educational values may thus be reflected in the wider aims of education which transcend particular disciplines.

The complexity produced by the wide range of educational aims can be appreciated by pursuing some of the possible outcomes of the present study. Assume, for example, that the consequence of having student-teacher meetings in the present context is that students become more active and involved within the classroom but that this activity does not facilitate learning. Assume further that students preferred receiving their feedback and diagnoses from the teacher rather than from the computer, and that they developed more positive attitudes toward the instructor and their classmates as a result of the special meetings. Would these results be valuable? In particular, would they be worth preserving by refusing to replace the human-supplied feedback with computer-supplied feedback? The value of increased learning and performance is clear because of its relationship to a central aim of education, but the value of increased responsiveness in class or more positive attitudes, when divorced from any increased learning of the subject matter being taught, is not so clear. These results would presumably be of value by their relationship to the wider aims of education. Those aims are more difficult to identify, and their importance relative to the more central aims is certainly difficult to assess precisely. How much would one’s judgment change, for instance, if the positive impact on student responsiveness and attitudes were accompanied by a decrease in subject matter learning?

These questions point in the direction of future inquiry which will go hand-in-hand with continued empirical research. Much of that inquiry will explore the use of statements of educational aims as a framework for assessing the value of particular empirical results. Can these aims be reliably identified? Can their corresponding values be explicated? What sorts of considerations justify educational aims, particularly those placed in the “wider” category? What kinds of educational theories would make the wider or peripheral aims more central? These and related questions will be pursued during the second half of this study. As already noted, they constitute elements of the normative aspects of a study which has both empirical and normative components. It should be clear, however, that these components are not independent of each other. The idea that the consequences of some innovation can be first determined and then evaluated in an ethical light is too simplistic. One does not determine THE consequences in the sense that all are considered and measured. Rather, only certain consequences, already suspected of being ethically relevant, are explored. Something similar happens when putting a research hypothesis to the test. The observations made in that test are selected from a large number of possibilities. Only those which are logically related to the hypothesis are considered. Likewise, the possibility of having ethical relevance guides the empirical focus of research concerned with actual outcomes and their value. It also provides the impetus for further inquiry into the foundations for that possibility. What results is a process of refinement and clarification in which the relationship of certain outcomes to certain values is made more explicit.

One consideration of significance in the conduct of this research involves the issue of managing technology. The question of whether the student-teacher meetings should be replaced by student-computer interaction has been the prime motivation for this research. Much of the research design has been built around this concern, and the way in which this concern is addressed bears further discussion. In the descriptions given above, the expression “human-supplied versus computer-supplied feedback,” or some equivalent, is often used. This may give the impression that the only difference in treatment between the two student groups is whether the diagnostic feedback comes directly from the teacher or from the computer. But in fact there are other associated differences. For example, a student may ask a question and receive an answer during a special meeting, or may express doubts or fears and receive reassurance or special motivation. Neither of these interactions is possible with computer-supplied feedback. Thus the two student groups do not differ solely in the origin of the information they receive; there is a difference in the information itself.

Earlier it was stated that the content of the diagnostic feedback was identical for a given student difficulty regardless of the origin of the feedback. This statement is in need of some elaboration, however. It is evident that students receive other types of information besides that of diagnosis and prescription. Also, students may demonstrate additional weaknesses during the course of their individual meetings and thus receive augmented diagnoses.

All of these complexities are recognized in the present research design. The main concern has been to determine, prior to implementation, what the consequences would be of replacing the special meetings with an expanded computer program. In that regard, the difference between having human-supplied and computer-supplied feedback in the context of the current study should parallel the shift, from one semester to the next, from a course based on human-supplied feedback to one based on computer-supplied feedback. This shift is in fact simulated in the existing research procedures. The main objective of marshaling sound evidence concerning the empirical consequences of this potential shift is thus secured. Attaining this objective supports the effort to intelligently manage computer technology by contributing to the accurate prediction of the effects of a technological innovation prior to actually implementing it. If computer-supplied feedback results in an increase in learning and other desirable results, without an increase in undesirable results or a decrease in previously obtained advantages, then work on producing a system for generating computer-supplied feedback should begin. At the very least, the human-generated, computer-delivered feedback which is actually provided during this study should be continued.

It may turn out, of course, that different types of students react differently to the different treatment conditions. That is, students may possess various characteristics which predispose them toward increased learning and positive attitudes under one mode of feedback as opposed to the other. This prospect opens up a number of possibilities that will be investigated during the second year of this study. That investigation will address the opportunity for applying the results of this research toward increased individualization, which has been one of the chief motivations for the development of computer-assisted instruction. This opportunity also opens the door to explicating the role of individualization as a value and aim within the American educational system. The combined empirical/normative emphasis of this research will thus continue.

Conclusions

Increased individualization has been one of the chief motivations for the development of computer-assisted instruction. Whether individualization can best be provided by well-programmed computers or by human teachers is an as yet undecided empirical question. That question is extremely complex. A wide variety of consequences given each alternative should be examined, and judging the value of those consequences is ultimately a matter of ethical appraisal. The ethical complexities will doubtless be as recalcitrant as the empirical ones, but studies which simultaneously confront the empirical and normative dimensions of the matter offer some hope. Defining and justifying the aims of education, their corresponding values, and the particular practices which instantiate those values is a challenging task. Accomplishing this task will require the resources of educators and psychologists as well as philosophers, but it appears to be a challenge which must be met if new educational technologies are to be put to intelligent and careful use.

University of North Carolina at Charlotte

Notes

  1. This material is based upon work supported by the National Science Foundation under Grant DIR – 8921033. The Government has certain rights in this material. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation. The authors are grateful for consultations provided by Dr. Robert J. Cavalier of the Center for the Design of Educational Computing, Carnegie-Mellon University.
  2. Sleeman and Brown, 1982, p. 9.
  3. Chaiklen and Lewis, 1988, p. 84.
  4. Dewey, 1916, p. 271.

References

S. Chaiklen and M. Lewis, (1988) “Will There Be Teachers in the Classrooms of the Future?… But Then We Don’t Think About That,” in R. McClintock (ed.) Computing and Education, Teachers College Press.

J. Dewey, (1916) Democracy and Education: An Introduction to the Philosophy of Education, Macmillan.

D. Sleeman and J.S. Brown, (1982) Intelligent Tutoring Systems, Academic Press.

Appendix – Report on the Discussions of the Working Group on Policy Issues in Campus Computing

National Conference on Computing and Values
Report on the Track: Policy Issues in Campus Computing
Report on the Discussions of the Working Group
on Policy Issues in Campus Computing

Marvin J. Croy

This report constitutes a summary of the discussions of the Working Group on Policy Issues in Campus Computing, which met at the National Conference on Computing and Values in August of 1991. The aim of this summary is not to provide solutions to problems concerning computer ethics on campus. Rather, it is to communicate the concerns expressed by those involved in the discussions, primarily for the purpose of identifying and effectively characterizing existing problems. The identification of these problems should serve as a stimulus to the budding Research Center on Computing and Society and hopefully will suggest a direction and focus for its future growth. The working group on policy issues in campus computing was composed of the following persons:

Della Bonnette-University of Southwestern Louisiana
David Bridge-Smithsonian Institution
Leslie Burkholder-Carnegie-Mellon University
Marvin Croy-University of North Carolina – Charlotte
Antonio De Vido-Pennsylvania State University
Jan Eveleth-Yale University
Timothy Foley-Lehigh University
Richard Gordon-University of Delaware
Beth Kevles-Massachusetts Institute of Technology
Carol Noyes-University of Hartford
Sue Stager-University of Indiana
T. C. Ting-University of Connecticut
Lee Watkins-Johns Hopkins University
Sally Webster-State University of New York/CESF
Frank Wozniak-Southern Missouri State University

The members of this group, although fewer in number than those in other working groups at the conference, brought with them considerable experience relevant to many of the practical problems of computer ethics on campus. Many in this group held administrative positions or were computer professionals who had a direct hand in formulating, interpreting, and enforcing rules governing the academic and administrative use of computers. One clear sentiment was repeatedly expressed by this group: many complained that the policies they had to work with were vague, contradictory, or nonexistent. In short, these policies, even when they existed, were difficult to interpret and apply in a consistent manner. Many in this group stated that they often found themselves in situations in which definite decisions were required, yet, lacking appropriate precedents and paradigms, they were forced into making judgment calls on an ad hoc basis without the kind of professional support they desired. Much of the group’s discussions centered around the problems of designing and implementing a code governing the use of computers on campus, and often these discussions led directly to ethical questions concerning the rights and obligations of particular individuals or groups. In addition, the role that the Research Center on Computing and Society might play in contributing to the resolution of these problems and questions was considered. In the following report, each of these topics is presented in a separate section, although the underlying themes are closely connected.

1.0 Problems Related to Ethical Issues

As previously stated, the issues which dominated the discussions of this working group were related either directly or indirectly to the formulation of an ethical code for computer use on campus. The issues were wide-ranging, and presenting a list of them here would do little to represent their complex interactions. Fortunately, Richard Gordon has supplied an overview of many of these issues in his description of a set of guidelines currently being considered at the University of Delaware. His commentary addresses the question of what aims such policies and guidelines should serve and is included above on pages 33 – 47 as an addendum to this report. The working group’s discussions made it clear that building a coherent framework for making ethically related decisions about what constitutes appropriate or inappropriate behavior is a demanding task. It requires the ability to make values explicit and to specify what counts as serving them in particular contexts. It requires a sensitivity to the rights and concerns of a variety of individuals and groups. The questions “at whom is the policy aimed?” and “whom is the policy supposed to protect?” are often given different answers by different parties on campus. Attempting to provide equity of access to computer resources, for example, may unearth conflicts that require the establishment of priorities. Readers interested in these topics should consult Richard Gordon’s document directly.

Some problems associated with the use of computers on campus transcend the physical boundaries of colleges and universities. Many educational institutions are connected to computer networks such as BITNET or INTERNET. These networks support electronic mail and list servers for the discussion of a variety of topics. The problems of confidentiality associated with electronic mail are perhaps obvious given its similarity to paper mail. However, there are a number of questions that plague the use and management of these electronic discussions. One problem, suggested by Tim Foley, concerns the lack of consistency among the regulations governing the discussions or networks which are linked together. “There are different policies for using BITNET, EARN, and NSFNET. Political and religious activism is forbidden on EARN and the definition of activism could easily be construed as limiting one’s constitutional rights. Work needs to be done on some sort of standard policy with some basic concepts of ethical behavior. It appears that restrictions on types of information should be implemented at local sites rather than worrying about things such as political activism or religious activism going across backbone connections.” Of course, the question of what sort of restrictions should apply to public postings on networks comes to a head in the example of the “Love Slave” advertisement presented in Leslie Burkholder’s conference track address. In the next section, more will be said about how these cases lead to questions of ethical principles and their application. It should be clear at this point, however, that the diversity in purpose and scope of various networks produces problems in respect to their regulation, problems which are compounded by bridges which connect these networks.

2.0 Research into Ethical Issues of Computing

The relationship of problems in computer ethics to general theories of ethics and values is made clear in Leslie Burkholder’s description of the “Love Slave” advertisement and the exchanges it stimulated. The analysis of this case leads quickly to the concept of freedom of expression, the arguments of John Stuart Mill, etc. But this raises a question about the status of “computer ethics” itself. Is there really any such thing as computer ethics, an independent discipline? Is computer ethics merely the application of well-worn ethical principles to new cases, or does the analysis of the cases lead to new ethical principles? Some believe that not much hangs on the answer to this question, but pursuing it can show something about the complexities of inquiry within this field.

As is well known, one of the chief problems concerning the ethical use of computer technology on campus is the pirating of software protected by copyright. One question pursued at length during the panel discussion on intellectual property and the ownership of software was whether software piracy constitutes stealing. If it does, then these cases can all be handled straightforwardly by means of existing ethical principles. However, some have argued adamantly that software piracy does not fit the conceptual mold of stealing. The point here is that when existing concepts fail to apply cleanly to new situations, this signals the need not only for conceptual change but also for the reformulation of the relevant ethical principles themselves. Existing principles which cover stealing become irrelevant, and new principles built around new concepts are required. To the extent that this occurs, computer ethics is not merely the application of well-known ethical principles and concepts but becomes a field whose uniqueness increases with the uniqueness of computers and associated phenomena. Theoretical work in computer ethics which aims at assessing and reformulating concepts and principles thus takes on an added significance.

In his capstone conference address, John Ladd referred to the false analogies of society that supported the wartime activities of Nazi Germany. This reference is suggestive in respect to theoretical work in computer ethics. Many of the discussions addressing tough questions in computer ethics make use of analogies. These analogies characterize the computer and its use as being similar or dissimilar to other entities or processes. For example, during discussions of computer software piracy, comparisons were made between copying someone’s software and swimming as an uninvited guest in someone’s pool or borrowing an item without permission yet returning it. The prime question here is how analogies should be evaluated and criticized. Can we, for example, know that a societal analogy is faulty when it is initially presented, or must we wait until that analogy has been played out through history? How should analogies which emphasize some features of computers and their use while de-emphasizing other features be sorted out? Pursuing these questions leads to well-developed inquiries in logic and epistemology and ties computer ethics to traditional fields of philosophy.

3.0 The Role of RC/C&S

Many of the difficulties involved in computer ethics on campus require particular decisions in response to practical problems. The foundations for these decisions are connected to ethical principles and theory. Members of our working group recognized this diversity and proposed that the future activities of the Research Center be broad enough to speak to all levels of computer ethics as it relates to campus computing. A number of the proposals made by this group are listed below.

3.1 Providing Access to a Wealth of Information

One widely agreed upon view of the future activities of the Research Center revolves around its function as an information resource. The Research Center is seen as a repository for a wealth of information concerning computer ethics. This information should be categorized and stored for easy access and should be wide-ranging in scope. A number of different types of information should be included. First, the repository should contain an annotated bibliography plus abstracts of relevant books and articles. These should include not only recent works on computer ethics but also empirical studies of the social and psychological effects of using computers. Second, a collection of cases for analysis should also be stored and categorized. These cases would include descriptions of dilemmas and conflicts which arise within the realm of campus computing. Third, statements of existing policies and procedures in effect at various universities should be collected.

3.2 Reviewing Proposed Policies

Providing collections of existing works, dilemmas, and policies can be thought of as a descriptive function of the Research Center. But there is also a normative function to be served. The Research Center is in a unique position, by virtue of its close ties with philosophers, university administrators, and experienced computing professionals, to offer evaluations of existing dilemmas and policies. These evaluations would provide a form of peer review particularly for those in the early stages of formulating their own policies or guidelines. For example, after consulting a number of relevant examples, a draft of a set of guidelines might be drawn up and then submitted to the Research Center for forwarding to those with experience in designing and implementing such policies. These reviewers would provide assessments of the strengths and weaknesses or potential difficulties of the documents submitted, with particular reference to their own experience and to existing policies at various universities. Statements of those policies, of course, would also be on file for easy reference. Reviewers might also offer advice concerning strategies for building a consensus on campus which would lead to adoption of the formulated policies.

3.3 Supporting Online Consultation

Another possibility which the Research Center should explore is that of providing a means for ongoing, ad hoc consultation. This consultation would function as a hot line for the resolution of pressing problems and could constitute an online service by means of one of the established computer networks.

3.4 Establishing Relationships with Other Organizations

The Research Center should establish a working relationship with organizations having similar aims or concerns, such as the EUIT (Educational Uses of Information Technology) group of EDUCOM. Representatives of the Research Center should be sent to major conferences sponsored by such groups. When different organizations are involved in similar projects, ways of supporting collaboration on those projects should be explored.

3.5 Educating Computer Users

The Research Center should play a role in increasing the awareness of computer users on campus of ethical issues. This goal might be achieved by the Center’s publication of pamphlets or the production of videotapes, short courses, and other training materials or by assisting others involved in this endeavor. This activity might also carry over into the development and distribution of syllabi and other materials for courses on computer ethics and professional ethics for both students and professionals. It might additionally take the form of summer seminars held at the Research Center for the purposes of continuing education for faculty, computer professionals, and administrators.

3.6 Obtaining Private and Federal Grants

There are two possibilities in respect to the role the Research Center could assume in the execution of externally funded research. First, individual researchers would submit their own proposals to various agencies while using the Research Center as a supporting consultant. Second, the Research Center itself would submit proposals for projects supported in part by the expertise of professionals at other institutions. One federal agency that should be targeted for such proposals is the Ethics and Values Studies in Science division of the National Science Foundation. In addition, a program in support of visiting scholars should be established. This program would facilitate creative work and productive interactions among researchers in the field of computer ethics. This work should be diverse, sometimes focusing more upon practical problems and sometimes directed toward analyses of ethical principles and theories.

4.0 Conclusions

Computer ethics is a relatively young field. Establishing research centers, developing policies, initiating collaboration among organizations, and the like, are the first steps of an effort that should integrate intellectual and practical activities and make good use of existing resources. One resource which should not be overlooked is the experience and intelligence of the many people who have active interests in ethical issues as they bear on the varied uses of computers. To the extent that this resource is properly put to use, the Research Center will indeed be the centerpiece of a multitude of activities leading to creative solutions to important problems. If that occurs, then the National Conference on Computing and Values will certainly be seen as the kind of ground-breaking that allows many seeds to be cultivated for better growth. The working group on policy issues in campus computing is pleased to have had a hand in the sowing of those seeds.