Electronic Voting Systems and the Therac-25: What have we learned?

AUTHOR
William M. Fleischman

ABSTRACT

Between June 1985 and January 1987, a series of accidents involving the Therac-25 medical linear accelerator caused severe injuries to six cancer patients. Three of these patients died as the result of massive radiation overdoses. The accidents were found to have been caused by the failure of software that controlled safety-critical operations of the Therac-25. A thorough retrospective analysis of these accidents undertaken by Nancy Leveson and Clark Turner revealed that, from an engineering standpoint, the Therac-25 was a poorly and carelessly designed system. More generally, their analysis points to failures at a higher level of abstraction in the systems of medical treatment in which the Therac-25 was utilized, as well as failures in the regulatory regimes meant to protect the public through prior approval and oversight of such medical devices.

The Therac accidents are widely studied in courses or modules devoted to the ethical responsibilities of professionals in the computing field. It is difficult to imagine that they did not influence the authors and the content of the various professional codes of ethics – for example, the 1992 revision of the Code of Ethics of the ACM and the Software Engineering Code of Ethics promulgated in 1999.

Since the introduction of electronic voting systems following the passage of the Help America Vote Act (HAVA) in 2002, numerous studies – we cite, among others, investigations by teams at Johns Hopkins University, Princeton University, the University of California at Berkeley, and the Center for Election Integrity at Cleveland State University – have disclosed serious and unsettling flaws in virtually all of the electronic voting devices marketed and in use in the United States. In addition, experience in the use of electronic voting devices in recent elections has confirmed their fallibility. Ballots have been inexplicably lost from or added to vote totals, direct recording electronic devices (DREs) have provided incorrect ballots, machines have failed to operate at the start of voting and have broken down during the course of an election, memory cards and smart card encoders have failed during elections. Since HAVA was intended to prevent problems like those encountered in the contested 2000 Presidential election, these shortcomings have created the unsatisfactory situation in which the purported remedy for problems associated with the conduct of fair elections has in actuality served to intensify public doubts about the electoral process. By analogy with the case of the Therac-25, the software controlling these electronic voting devices can be considered “safety-critical” in the sense of safeguarding the integrity of elections on which public trust in the legitimacy of elected governments rests.

Carefully considered, these studies of the deficiencies of electronic voting systems reveal numerous striking analogies with the engineering and system failures diagnosed in the case of the Therac-25. The analogies begin at the level of operation of the devices themselves, in particular the presence in each instance of chronic “minor” malfunctions which must somehow be ignored or explained away in order not to undermine belief in the trustworthiness of the devices. At a higher level, investigations uniformly disclose the absence of defensive design, overconfidence in the infallibility of software, inadequate software engineering practices relating to safety and security, conflation of user friendly with safe interface design, and, most pointedly, inadequate or nonexistent documentation.
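To make "defensive design" concrete, the following sketch illustrates the general pattern of re-verifying every safety-relevant condition immediately before an irreversible action rather than trusting earlier software state. All names and checks are invented for illustration; they are drawn neither from the Therac-25 nor from any voting-system code.

# Hypothetical sketch of a defensive-design check; all names are invented.
class SafetyInterlockError(RuntimeError):
    """Raised when an independent pre-action safety check fails."""

def deliver_treatment(console_dose: float, device_dose: float,
                      hardware_interlock_clear: bool) -> None:
    # Defensive design: never assume earlier software state is consistent;
    # re-check every safety-relevant condition just before acting.
    if not hardware_interlock_clear:
        raise SafetyInterlockError("hardware interlock not clear")
    if abs(console_dose - device_dose) > 1e-9:
        # Operator entry and machine state disagree: fail safe with a clear
        # message rather than proceeding or emitting a cryptic error code.
        raise SafetyInterlockError("dose mismatch between console and device")
    print(f"delivering {device_dose} units")  # stand-in for the actuation step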

In comparing the medical treatment systems in which the Therac-25 was utilized with the systems of state and local election boards which form the “customer base” for electronic voting devices, we find the same pattern of articulation failure within organizations, failures of due diligence, complacency involving unwarranted trust of the vendor and tolerance for fault-prone devices, and inadequate training of personnel.

At the level of the vendor or manufacturer, the analogies that begin with the poor engineering practices already cited extend further to the predilection to rush the product to market while overselling its reliability, the absence of documentation and audit trails concerning adverse incidents, inadequate response to such incidents, and evidence of willingness to bear the cost of penalties rather than undertake necessary engineering revisions.

Finally, the situation at the level of regulatory regimes seems even less satisfactory in the cases relating to present-day electronic voting devices than it did in the 1980s in connection with the radiation accidents associated with the Therac-25. The same problem of the absence of regulatory personnel with the technological competence to evaluate system shortcomings appears to plague the current case as it did at the time of the approval and oversight of the operation of the Therac-25. At the same time, there appears to be a widespread belief at present that the regulatory system will somehow fix everything.

In this paper, we will explore the analogies between the deficiencies of the Therac-25 and those of present-day electronic voting systems by laying out in some detail the elements of similarity described above, paying particular attention to system interactions at several pertinent hierarchical levels. We will try to come to grips with the question of what has been learned – especially in regard to responsible practices of software engineering – by past experience in relevant circumstances and what makes it so difficult to avoid the vexatious repetition of certain unsatisfactory patterns of behavior, even in the presence of admonitory precedent. Finally, we discuss some possibilities for incorporating the insights offered by the comparison of these two cases in courses on ethics and professional responsibility, especially in the education of those aspiring to careers in software engineering.

How Technology can make Green Greener – A Closer Look at Curriculum in Construction Management

AUTHOR
Boyd Fife

ABSTRACT

The building industry in the United States has a tremendous impact on the environment, human health, and economy of the country. Buildings consume 38-40% of total U.S. energy, 71% of the electricity, and 12% of the water. Building demolition, remodeling, and construction generate over 35% of non-industrial waste. Indoor air pollution can cause health problems: pollutant concentrations in buildings are typically two to five times, and sometimes more than 100 times, greater than those of outdoor air. Most Americans spend more than 90% of their time in buildings, yet indoor environments in some buildings have been associated with human health impacts that range from asthma and respiratory tract irritation to Legionnaires’ disease and cancer.

With respect to carbon dioxide, the operation of buildings produces 38% of all U.S. emissions. One of the foremost efforts to curtail global warming is Architecture 2030, created by the world-renowned architect Edward Mazria, AIA. Architecture 2030 states, “Our goal is straightforward: to achieve a dramatic reduction in the global-warming-causing greenhouse gas (GHG) emissions of the Building Sector by changing the way buildings and developments are planned, designed and constructed”. Architecture 2030 also reports that “there are hundreds of coal-fired power plants currently on the drawing boards in the U.S. Seventy-six percent (76%) of the energy produced by these plants will go to operate buildings”.

The building industry is the largest economic sector in the U.S. and the second largest manufacturing sector. The recent economic recession has made evident that even a single segment of the industry, the residential building trade, plays a substantial role in the U.S. economy. Traditionally, developers, architects, and builders in the U.S. have considered the immediate costs of constructing a building and have largely ignored its long-term energy consumption.

The U.S. Green Building Council (USGBC) is the nation’s leading nonprofit organization composed of corporations, builders, universities, government agencies, and other nonprofit organizations working together to promote buildings that are environmentally responsible, profitable, and healthy places to live and work. Its research concludes that funding for green building research and development is inadequate. Quoting from the USGBC report: “Only about 0.2% (two-tenths of one percent) of all federally funded research from 2002-2004 [went to green building research,] an average of $193 million per year. These amounts are minuscule compared not only with the environmental impact of the building industry, but also with its economic impact (at $1.1 trillion, it is more than 14% of the U.S. gross domestic product)”. The National Science Foundation reported that construction-related research stood at 1.2% of sales. This is well below other industries such as nanotechnology, which receives $1.9 billion from private industry and $1.4 billion from public sources in the U.S. Nanotechnology research may even prove harmful rather than beneficial to humans and the environment, whereas green building research would yield almost immediate positive results by reducing greenhouse gases, enhancing energy security, and preserving resources such as water.

Given the huge impact that the building industry has on the environment, human well-being, and the economy of the United States, the obvious disconnect between what is needed in the nation and what is actually happening raises serious ethical issues.

This paper is based on ongoing research in which different processes of instruction are being examined in the context of design and construction decisions for “safer” and “greener” buildings. Given the discussion so far, the main research question addressed here is: “Are construction students being taught correct, ethical building science principles, and is related energy-saving software being introduced in the classroom?”

In addressing the research question above, interviews designed for data collection will be directed toward an examination of construction management curricula, including issues linked with energy-efficient building practices and related construction software. Questions will specifically address solar heat gain as a way to reduce the need for electricity and fossil fuels for comfort heating of buildings, as well as other high-efficiency, climate-responsive designs. In addition, questions will cover the use of computer programs such as, but not limited to, Elite Software x for heating, ventilation, and air conditioning (HVAC) design efficiency, REM/Rate ix home energy rating software, and 3-D computer-aided drafting software for Building Information Modeling xi (BIM). The findings of this research will not only enhance awareness of traditional building efficiency and of communication using BIM and other energy modeling software, but will also help in training construction management students and faculty to design buildings that take into account regulations as well as social and ethical issues. Fox’s definition of ethics in the context of buildings is particularly useful. He points out that ethics can be viewed as concepts that we test, qualify, and reconstruct through an ongoing, dynamic process of design innovation.

In light of this, the proposed research will contribute significantly to the teaching of green, energy-efficient building practices with computer applications within construction programs in the state of Utah. The first step in this teaching process will be the reporting of findings from the research at the Utah State Department of Education’s annual construction conference, to be held in June 2010 at Southern Utah University. The findings will be shared with construction teachers throughout the state and will shape energy-efficiency workshops for Utah teachers. The workshops will be designed to train teachers in those energy-efficient building practices that the research finds to be weakly covered.

ENDNOTES

i. See Annual Energy Review 2005, DOE/EIA-0384(2005), Energy Information Administration, U.S. Department of Energy, July 2006, and Estimated Water Use in the United States in 1995, U.S. Geological Survey.

ii. Office of Solid Waste, U.S. Environmental Protection Agency, Characterization of Building-Related Construction and Demolition Debris in the United States, EPA 530-R-98-010, June 1998.

iii. The Total Exposure Assessment Methodology (TEAM) Study, EPA 600/S6-87/002, U.S. Environmental Protection Agency, 1987. http://www.epa.gov/ncepihom/

iv. Statistics from sources cited in: U.S. Environmental Protection Agency, Buildings and Environment: A Statistical Summary, December 20, 2004, http://www.epa.gov/greenbuilding/pubs.pdf

v. See http://www.architecture2030.org/2030_challenge/index.html

vi. Available online from the U.S. Green Building Council at http://usgbc.org/ShowFile.aspx?DocumentID=2465

vii. Department of Energy, 2006 DOE Buildings Energy Data Book, 19 February 2007. http://buildingsdatabook.eren.doe.gov

viii. See Lux Research Inc., The Nanotech Report, 4th Edition, 2006. http://luxresearchinc.com/pdf/TNR4_TOC.pdf

ix. See REM/Rate online at http://www.archenergy.com/products/rem/

x. See Elite Software online at http://www.elitesoft.com/

xi. See one company’s concept of Building Information Modeling at: http://usa.autodesk.com/company/building-information-modeling

xii. See Fox, “Ethics and the Built Environment” http://books.google.com/books?id=oC2WWKET5NoC&pg=PA82&lpg=PA82&dq=ethical+concerns+in+building+green+buildings&source=bl&ots=j_rsiw6XDG&sig=PT0BgKUsPmCRR3H7GqTKjKcF9Ww&hl=en&ei=TdVkSrmoEeavtgesrpT2Dw&sa=X&oi=book_result&ct=result&resnum=2

Finding a Core Curriculum in Technology Education

AUTHOR
Matthew Edwards

ABSTRACT

Recent evidence in the US indicates that high school students, who have completed more academic subjects than their predecessors, increasingly view academic schoolwork as less interesting, less meaningful, and less likely to be useful later in life (Wraga 2009). Yet we are seeing a big push in the US to increase the number of mandatory core-curriculum courses in the public school system, which in turn is pushing many elective courses out of our schools. The core curriculum is defined and interpreted by most institutions as English, Math, and Science. If interest in a subject is an indication of how much we learn, or actually get out of it, then the research summarized below suggests that our education systems are in serious trouble.

An annual survey of twelfth-graders conducted by the Survey Research Center at the University of Michigan’s Institute for Social Research (NCES 2004; Johnston et al. 2005) has documented some interesting trends. In 1983, when asked “how often schoolwork is meaningful,” 40.2 percent of seniors responded “often or always” and 18.3 percent responded “seldom or never.” In 2005, however, only 27.5 percent responded “often or always” and 28.2 percent responded “seldom or never.” When asked in 1983 “how important school learning will be later in life,” more than half of all the students surveyed responded “quite or very important” and 19.9 percent responded “not or slightly important”; in 2005, 37.1 percent responded “quite or very important” and 28.8 percent responded “not or slightly important.”

The only surprising thing about these trends is that many of those who make decisions affecting school curriculum are surprised. When the numbers of students who attend and graduate from universities are examined, these trends of declining interest and faith in education become easier to understand. Currently, only about 23 percent of US students graduate from high school, continue on to a university, and complete a degree. Generally speaking, the number of students who graduate from universities matches or exceeds what is needed in the workforce, although of course there are specific areas of need where this is not the case. This strongly suggests that the amount and method of teaching the “core” to almost 80 percent of the population may actually be inappropriate and unethical. Learning English, Math, and Science is good and necessary, but is a constant immersion in these courses, using today’s methods of teaching, the only path to success and future employment opportunities? Can ICT (Information and Communication Technology) courses be a part of the core curriculum, or vice versa?

A recent report from the National Center for Education Statistics, “Special Analysis 2007: High School Course Taking,” stated: “From the early 1980s, when states began to increase the number of courses required to receive a high school diploma, the average number of credits earned by high school graduates increased from 21.7 credits in 1982 to 25.8 credits in 2004” (NCES 2007). In his article “Toward a Connected Core Curriculum,” William Wraga states that “The analysis indicates that these increases occurred in academic courses; during the same period, enrollment in vocational courses declined” (Wraga, 2009). Many ICT courses are elective courses in the US school system. The push for more “core” is beginning to have negative effects on class sizes for all elective courses, including ICT courses.

One option is to create an integrated core curriculum which organizes educational experiences around common personal and social problems, with subject matter introduced only as it relates to particular problems that one might encounter in real life situations. Some educators refer to this type of education as “Applied Education”.

One possible solution is to use common computer software and integrate a curriculum that can take advantage of many disciplines. For example, teach students math by using spreadsheet applications from industry, such as Microsoft Excel. Many firms in the field of Construction Management use this type of software to estimate volumes, measurements, and costs of structures in the heavy civil and heavy commercial building industry. An integrated curriculum could cover several courses – Math, Technology Education, Family Science, Construction Technology, and Personal Finance – and could be taught by teachers in these various disciplines. ICT courses can include a broad range of subject matter and methods. Teaching methods include, but are not limited to, research papers, oral and technical presentations of research, and the development of communication skills using technology. With this kind of variety it seems quite possible that ICT courses could integrate a fair amount of core curriculum into the course. A full treatment of the possibilities goes beyond the scope of this paper, but they should be given serious reflection by educational administrators.
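As an illustration, the quantity-times-unit-cost arithmetic that an estimator performs in a spreadsheet can equally be expressed in a few lines of code; the item names, quantities, and unit costs below are invented for the example and are not drawn from any curriculum or firm.

# Illustrative only: a spreadsheet-style quantity take-off and cost estimate.
# Item names, quantities, and unit costs are invented for classroom use.
items = [
    # (description, quantity, unit, unit cost in dollars)
    ("Concrete footings", 12.5, "cubic yard", 165.00),
    ("Framing lumber", 5400, "board foot", 0.85),
    ("Asphalt shingles", 32, "square", 110.00),
]

total = 0.0
for description, quantity, unit, unit_cost in items:
    line_total = quantity * unit_cost
    total += line_total
    print(f"{description:20s} {quantity:9.1f} {unit:12s} ${line_total:12,.2f}")

print(f"{'Estimated direct cost':44s} ${total:12,.2f}")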

A common problem of using an integrated approach arises when we train our teachers to disaggregate curriculum to a point where many teachers have very little experience with practical application. When teaching university courses I have found that very few students have the slightest understanding of how to apply their math skills in the applications that I teach for Estimating and Bidding. For example, basic geometry is taught to builders when squaring large buildings, or figuring roofing materials from areas with varying pitches. Yet, many university students that I have taught have felt that this commonly used math was completely useless outside of trying to get a good grade in the math class. Consequently, I find that I am re-teaching basic math concepts to students who have completed math courses through calculus.
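The roofing example above lends itself to a short worked computation: the horizontal (plan) area of a roof section is converted to sloped surface area by the slope factor sqrt(run^2 + rise^2)/run determined by the pitch. The sketch below uses invented dimensions purely to illustrate the arithmetic.

import math

def roof_surface_area(plan_area: float, rise: float, run: float = 12.0) -> float:
    """Convert a roof section's horizontal plan area to sloped surface area.

    rise/run is the pitch (for example 6/12); the slope factor stretches the
    horizontal footprint along the slope.
    """
    slope_factor = math.sqrt(run ** 2 + rise ** 2) / run
    return plan_area * slope_factor

# Invented example: a footprint split into two sections of different pitch.
low_pitch = roof_surface_area(1200.0, rise=4)   # 4/12 pitch section
steep_pitch = roof_surface_area(600.0, rise=9)  # 9/12 pitch section
print(f"Roofing material needed: {low_pitch + steep_pitch:.0f} sq ft")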

This paper will focus on the popular educational “trends” of today, compared with educational facts that strongly suggest that an applied educational focus in non-traditional core subjects, such as ICT courses, is where positive answers can be found, especially as those applications can be easily linked to quickly evolving technologies in our modern world. I will also emphasize the ethical implications of governments that perpetuate an attitude that narrows the definition of core curriculum into disaggregated subject matter that can only be covered in courses specific to that particular subject.

REFERENCES

Johnston, L.D., J.G. Bachman, P.M. O’Malley, and J.E. Schulenberg. 2005. Monitoring the Future: A Continuing Study of American Youth (12th-grade survey). Conducted by University of Michigan, Institute for Social Research, Survey Research Center. ICPSR04536-v3. Ann Arbor, Mich.: Inter-university Consortium for Political and Social Research (producer and distributor). 2007-07-18.

National Center for Education Statistics (NCES). 2007. “Special Analysis 2007: High School Course Taking.” Downloaded from http://nces.ed.gov/programs/coe/2007/analysis/sa02b.asp in August 2007.

National Center for Education Statistics (NCES). 2004. “12th-Graders’ Effort and Interest in School.” Downloaded from http://nces.ed.gov/programs/coe/2002/section3/tables/t18_la.asp on 12 August 2004.

No Child Left Behind Act of 2001. Public Law 107-110, 115 U.S. Statutes at Large 1425 (2002).

Wraga, William G. 2009. “Toward a Connected Core Curriculum.” Educational Horizons 87, no. 2 (Winter).

How Does the Evolution of ICTs Change the Law? An Approach to Law Through the Philosophy of Information of Luciano Floridi

AUTHOR
Massimo Durante

ABSTRACT

Law as a normative system has always been concerned with the aim of governing reality. How law can govern reality, however, has constantly been a matter of endless theoretical dispute. The problem requires legal scholars to explain at least three fundamental terms: law, reality, and the relation between them. It consequently demands the elaboration of (i) a conception of law, (ii) an ontology of reality, and (iii) a theory of normativity. The evolution of Information and Communication Technology (ICT) is likely to redesign all of these terms, and to do so more radically than is usually thought. Many scholars do recognize the difficulty.

The proliferation of books and articles on the regulation of the Internet or of Cyberspace clearly shows this much. However, they often fail to acknowledge that such a problem requires a new theoretical approach. For instance, expressions such as “the law of the Internet” or “the regulation of Cyberspace” can be deceptive, since they implicitly suggest that the Internet or Cyberspace is merely subject to the regulation of law and has no regulative attitude of its own. This is not true, since both the Internet and Cyberspace may have, for instance, inherent topological dimensions (Pagallo, 2007), which can exhibit normative values. However, such a critique does not seek to reaffirm a current platitude, namely that technology is self-regulating; this would mean understanding technology once more as a normative system (“the code is the law”, as stated in Lessig, 2006).

The problem is to realize that the evolution of ICT is going to redesign the ontology of reality and that this change “can go backwards, forwards and sideways” (Alvin Toffler), as suggested by the theme of the Ethicomp meeting. A different status of reality (a different ontology) can therefore impose conditions or restrictions upon legal claims and can force us to reconsider our understanding of the law. What, then, is the ontological status of the reality that law is meant to govern? The Philosophy of Information brought forth by Luciano Floridi (2009a) has best grasped the ongoing transformation of the ontology of our reality and has formulated it in plain theoretical terms. This process can be summarised by reference to two neologisms that Floridi has coined: infosphere and re-ontologisation.

“Infosphere is a neologism I coined years ago on the basis of ‘biosphere’, a term referring to that limited region on our planet that supports life. It denotes the whole informational environment constituted by all informational entities (thus including informational agents as well), their properties, interactions, processes and mutual relations. It is an environment comparable to, but different from cyberspace (which is only one of its sub-regions, as it were), since it also includes off-line and analogue spaces of information. […] Re-ontologizing is another neologism that I have recently introduced in order to refer to a very radical form of re-engineering, one that not only designs, constructs or structures a system (e.g. a company, a machine or some artefact) anew, but that fundamentally transforms its intrinsic nature. […] Using the two previous concepts, my basic claim can now be formulated thus: digital ICTs are re-ontologizing the very nature of (and hence what we mean by) the infosphere, and here lies the source of some of the most profound transformations and challenging problems that our information societies will experience in the close future, as far as technology is concerned”, (Floridi, 2007).

In the first part of the paper, I analyse the general understanding of the relation between law and reality in the information society, with reference to the regulation and governance of the Internet and of Cyberspace. The way in which the problem is approached and dealt with is often quite old-fashioned and relies on legal and epistemological categories that are no longer capable of accounting fully for the changes brought forth by the evolution of ICT. This requires, first, recalling and clarifying the layered structure (physical, logical, content: Benkler, 2006) of the Internet and of Cyberspace, which still supports, at least to some extent, the application of traditional legal categories (regulations, norms of hard and soft law, codes of conduct, customary rules, etc.).

At this level, the legal debate is mainly concerned with the opposition between centralised (hierarchical) and decentralised (networked) models of law as well as (in a more mature phase) with the dialectics between decentralised (with intermediaries) and distributed (without intermediaries) networks (Durante, 2008). Such an analysis is still consistent with some of the phenomena that are redesigning the multiagent systems of the Internet and of Cyberspace, such as “cloud computing” (The Economist, 25 October 2008) or “autonomic computing” (Kephart & Chess, 2003). However, it is not sufficient to account for the “reontologisation of reality” (Floridi, 2007) that characterises the “infosphere” (Floridi, 2003).

For this reason, the second part of the paper is devoted to the analysis of the Philosophy of Information laid down by Luciano Floridi. In particular, it is important to focus upon the novelty of his informational approach for three main reasons. In fact, this approach offers us some consistent indications and theoretical tools to deal with three essential notions of law from an informational standpoint: 1) the notion of legal agents (legal subjects) as informational systems; 2) the notion of norm (legal provisions and expectations) as information; 3) the notion of reality (legal objects) as a new informational environment.

1) In the post-industrial society of information, legal subjects are to be understood also as interacting agents and patients that share information and messages. The informational approach does not only show how interacting agents and patients communicate and share data by means of positive or negative messages. Thanks to its ontocentric perspective, the informational theory offers a distinct, unified perspective on the varying status and regime of the agents, of their relations, and of the content of the shared data. On the basis of an ontological equality principle, all agents, relations, and data are conceived as informational entities that are to be treated as part of the informational environment, or infosphere, to which they belong qua informational systems.

2) In the post-industrial society of information, the capability of positive norms to predict future behaviours and establish expectations is to be rethought, starting from a deeper and more systematic analysis of the informational content of a norm (Durante, 2007). The idea of treating the legal norm as information is not new in the philosophy of law. What is novel is the need to treat information as a specific, technical concept, which is part of a semantic reinterpretation of reality. This requires us to explore, and make reference to, a specific theory of information, one that takes into account not only the syntactic aspects of information but also its semantic dimension (Floridi, 2009b).

3) In the post-industrial society of information, the evolution of ICT is no longer viewed, according to the informational approach, as a matter of applied ethics or of empirical or normative approaches, but as something that constitutes a new environment (the infosphere) and brings about a reontologisation of reality, made of informational objects that possess their own ontology. This requires us to study how the law can deal with informational objects and what main features characterise a reontologised reality, since this directly affects the relation between law and reality. It should be borne in mind that the informational treatment of reality does not necessarily constitute per se the entire picture, i.e. the whole representation of reality: this would require us to confront the informational approach systematically with a more radical digital approach to the understanding of reality.

Consequently, in the third and last part of the paper, we try to sketch the guidelines of an informational approach to law, that is to say, a vision of law that would be theoretically capable of dealing with the reontologisation of reality. The ontology on which law has always been based is made of material objects, whose positive existence is the ultimate epistemological guarantee of the content of legal propositions and expectations. Even though – to take a very simple example – the content of a legal contract is not directly a specific real thing but an exchange of promises, the way to measure whether the promises have been fulfilled is nevertheless to refer to the positive existence of the real things that are the final content of the exchanged promises. How does law change in its relation to reality when the ontology of reality is no longer based on the positive existence of material objects? The distinction between material and immaterial objects with which law is acquainted is not sufficient to deal with the problem, since that distinction is drawn within the same ontological framework that law has always been based on, namely, the positive existence of physical objects.

What does it mean to stabilise expectations when the content of an expectation is no longer a material but an informational object? Furthermore, what does it mean to dispose of an informational object, to destroy it, or even to deprive someone of it? Questions of this kind could easily be multiplied. We are aware that it is too early to attempt to answer them. However, it is reasonable to try to delineate a theoretical framework for them. This also requires focusing more closely on the fundamental concept of information (Floridi, 2009b), which plays a decisive role in the construction of a networked normative system.

The examination of Floridi’s Philosophy of Information already points out a significant line of research, since it suggests that:

1) The ontology of reality on which law is based is no longer made of material but of informational objects. This first suggestion has to be taken into consideration in its full meaning, since law is not confronted by informational objects as if they were something that lies in a place different from where law lies: both law and informational objects are part of the same reontologised reality – the infosphere – that constitutes a common environment. Law – as a normative system – still has the role of reducing the complexity of the environment; the question, however, is now how to deal with the growing complexity of the informational environment.

2) The aim of law in reducing the complexity of the environment by stabilizing expectations is to be rethought in relation to the specific meaning of information as well as to the structure of the information life-cycle. The Philosophy of Information helps clarify what information is, what the entire life-cycle of information is and, notably, what its informativeness consists of. This point is crucial and allows us to draw an analogy between law and information: both legal expectations and shared information pose the problem of the conditions under which data are managed and their content relied upon. Since multiple, distributed sources of information have been added to the traditional storage devices, the issues of managing data and of filtering them have become essential, along with the problems of agency that such management systems or filters bring with them. This problem is directly connected with that of the terms of accountability (responsibility and imputation) for the actions and decisions we take on the basis of the information we rely upon.

3) The relation between possibility and contradiction can bridge the legal and the informational system. The content of both a norm and of some information has to be something that is possible. The content of a formal possibility is, to speak in Kantian terms, what is not contradictory. Hence, as law grows out of contradictions that it tries to overcome by delimiting a realm of possibilities, a theory of information has to solve “the semantic paradox affecting the classic quantitative theory of semantic information, according to which contradiction contains the highest quantity of information” (Floridi, 2004). In this perspective, we should understand whether we may apply to law what Floridi (2004) says about information: “Information is an actual possibility that is consistent with at least one but not all other possibilities. A contradiction is not information-rich because it is not a possibility, and a tautology is not information-rich because it does not exclude any possibility”.
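For readers unfamiliar with the “classic quantitative theory of semantic information” mentioned in this quotation, a compressed reminder may help; the formulation below is the standard Bar-Hillel–Carnap content measure and is added here only as background, not taken from the abstract itself. The content of a statement s can be written as

\mathrm{cont}(s) = 1 - m(s),

where m(s) is the logical probability of s. A contradiction has m(s) = 0 and thus cont(s) = 1, the maximum, while a tautology has m(s) = 1 and thus cont(s) = 0. This paradoxical consequence – a contradiction counting as maximally informative – is precisely what Floridi’s theory of strongly semantic information is designed to avoid by tying informativeness to truthful, contingent content.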

REFERENCES

Benkler Y. (2006), The Wealth of Networks: How Social Production Transforms Markets and Freedom, Yale University Press, New Haven CT.

Durante M. (2007), Il futuro del web: etica, diritto, decentramento. Dalla sussidiarietà digitale all’economia dell’informazione in rete, Giappichelli, Torino.

Floridi L. (2009a), The Philosophy of Information, Oxford University Press, Oxford, Forthcoming.

Floridi L. (2009b), Information, Oxford University Press, Oxford, Forthcoming.

Floridi L. (2007), “A look into the Future Impact of ICT on Our Lives”, The Information Society, 23.1, 59-64.

Floridi L. (2004), “Outline of a Theory of Strongly Semantic Information”, Minds and Machines, 14.2, pp. 197-222.

Floridi L. (2003), “On the Intrinsic Value of Information Objects and the Infosphere”, Ethics and Information Technology, 4.4, pp. 287-304.

Kephart J.O. & Chess D.M. (2003), “The Vision of Autonomic Computing”, IEEE Computer, 36.1, pp. 41-50, online at: www.research.ibm.com/autonomic/manifesto.

Lessig L. (2006), Code: And Other Laws of Cyberspace, Version 2.0, Basic Books, New York.

Pagallo U. (2007), “Small world” Paradigm and Empirical Research in Legal Ontologies: a Topological Approach, in The Multilanguage Complexity of European Law: Methodologies in Comparison, edited by G. Ajani, G. Peruginelli, G. Sartor, and D. Tiscornia, European Press Academic Publishing, Florence, pp. 195-210.

Responsibility Ascriptions and Organizational Structure: The Case of OSS Communities

AUTHOR
Neelke Doorn

ABSTRACT

In the literature on ethics and ICT, ample attention is given to the responsibilities of programmers and system designers. The central questions relate to the topics of ‘responsible computing’, ‘responsible use’, and blame for harm due to erroneous software, whether intended or unintended. Recently, Deborah Johnson and Thomas Powers have enriched the discussion on ICT and responsibility by focusing attention on the role of the computer system itself and its complexity (Johnson and Powers 2005). An analysis of moral responsibility that pays no attention to the computer system itself is incomplete, the authors argue. Johnson and Powers identify a threefold complexity in the relation between responsibility and computer systems: ontological complexity, which is related to the many people involved in the development and use of ICT systems (the so-called problem of many hands); conceptual complexity, which refers to the ambiguity in the way the concept of responsibility is used; and technological complexity, which relates to the mutual connections between the ICT system, its user, and its developer in assessing technological moral action. Their paper is an important contribution to the discussion of moral responsibility within the context of technology development, which sometimes seems too narrowly focused on individual moral responsibility without taking into account the way the artifact itself limits or enables certain courses of action. However, one important element seems to be missing from this account of responsibility, namely the organizational structure in which computer systems are developed. Although artifacts unmistakably enable and limit certain courses of action, this holds even more for the organizational embedding of the actors involved in ICT.

In the present paper I will therefore approach the topic of responsibility from an organizational perspective. In order to do so I will focus on communities that work according to the Open Source Software (OSS) model. The reason for choosing OSS communities is twofold. First, OSS communities are the paradigmatic example of non-hierarchical networks, and as such they are prone to the problem of many hands. Second, whereas the ontological and conceptual complexity Johnson and Powers refer to applies to engineering and technology development as a whole, OSS communities are characteristic of work done in the ICT sector.

Although originating from communities of ICT hobbyists, the OSS model is increasingly considered a viable approach in commercial settings as well. Moreover, not only is there a tendency for commercial companies to work according to the OSS model, we also see a professionalization and formalization of innovation communities that work according to the OSS model. These communities get involved in professional projects, which imposes demands with respect to the quality of service, reliability and security of the delivered product. However, since these communities are mostly working with volunteers, Service Level Agreements (SLA) or formalized contracts are often lacking. This raises problems for the distribution of responsibilities, especially since the responsibility of software developers is often grounded in their professional role and it is exactly the latter that is often lacking or only vaguely defined within OSS communities. Does the lack of official roles mean that we cannot ascribe responsibilities to OSS developers?

In order to answer this question I will make a distinction between responsibilities that are ascribed retrospectively, which are often the ground for blame, and more forward-looking responsibilities. Whereas most of the literature seems to be focused on responsibility for negative outcomes in the past – given some negative outcome, can we hold some agent responsible for that particular outcome? – we can also look at responsibilities in a more prospective way, i.e., ask whether people take up active responsibility. To gain insight into these active responsibilities I will take a more empirical stance and show how responsibilities are actually distributed in OSS communities. I will focus on two examples of OSS communities in particular, viz. the Fedora project and WirelessLeiden, a local user community that created an open wireless networking infrastructure built on Wi-Fi technology in the Dutch city of Leiden and its surroundings.

The underlying hypothesis in this paper is that in OSS communities the power relations constituting traditional hierarchical organizations are replaced by relations of trust, which can be characterized by a distinct ethic or value-orientation on the part of exchange partners, a so-called ‘spirit of goodwill’. This ‘spirit of goodwill’ enables the actors to take up active responsibility and a willingness to make investments without contractual guarantees. In addition to strict economic exchanges, ‘trust networks’ are infused with social exchange, entailing unspecified reciprocal obligations. In the creation of relations of trust, informal processes, such as committing, seem to play a crucial role.

REFERENCES

Johnson, D.G. & Powers, T.M. (2005), Computer systems and responsibility: A normative look at technological complexity. Ethics and Information Technology, Vol. 7, 99-107.

Engaging the Students of Technology in an Ethical Discourse in the Information Age: Thoughts of Wiener and Gandhi

AUTHOR
Manju Dhariwal, Ramesh C. Pradhan and Raghubir Sharan

ABSTRACT

With Introna, let us consider the possibility and ‘The (im)possibility of ethics in the information age’. He poses the problem as follows: ‘Ethics is not easy anymore – maybe it never was. It seems as if the ethical resources available to the ordinary person are rapidly becoming fragmented, distributed and ambiguous. For one, the traditional sources of moral knowledge such as religion, the state and the family are becoming increasingly elusive as the nature and legitimacy of these institutions are being challenged and transformed… I am merely indicating that the question of right and wrong, of how one ought to live, has, for some time now, become less and less obvious’ (Introna, 2002). This thought persists even in the limited case in which a teacher attempts to engage students of technology in an accreditation-mandated course in ethics. The problem becomes even more severe in a culturally unique country like India.

This problem of the loss of meaning in ethical discourse is not new. It confronted the ancient Indian king Ashoka after his victory in the Kalinga war. It is illuminating to notice that Ashoka’s ethical disillusionment appeared at the end of a decisive, though gory, victory at Kalinga in which thousands were killed. Remorse at the emptiness of the victory led Ashoka to deep contemplation. The action that followed is well known in history: it led to the adoption and spread of Buddhism over a major part of the world over many centuries. On a lesser scale, a similar ethical disillusionment visited Norbert Wiener, the well-known American intellectual considered to be one of the founding fathers of computer ethics (Bynum, 2008), after the end of World War II. Wiener had contributed amply to the technology that led to the decisive defeat of the Axis forces in World War II. Be that as it may, at the end of the war Wiener was thoroughly disillusioned with the unethical behaviour of the war administration. He rebelled against the American State, as is obvious from a perusal of the letter “A scientist rebels!” written to the Atlantic Monthly (Wiener, 1947). In this letter, Wiener mentions his refusal to communicate one of his works to an engineer from Boeing working on guided missiles and his decision ‘not to publish any future work … which may do damage at the hands of irresponsible militarists’. For this refusal he was hounded by the State, and in this mood he wrote the book “The Human Use of Human Beings” (Wiener, 1950, 1954). Wiener’s disillusionment, still clinging to some rays of hope, can be surmised from the following quotation from the 1954 edition.

“Let us remember that the automatic machine…is the precise economic equivalent of slave labour…. However, there is nothing in the industrial tradition which forbids an industrialist to make a sure and quick profit, and to get out before the crash touches him personally…

“Thus the new industrial revolution is a two-edged sword. It may be used for the benefit of humanity, but only if humanity survives long enough to enter a period in which such a benefit is possible. It may also be used to destroy humanity, and if it is not used intelligently it can go very far in that direction. There are, however, hopeful signs on the horizon… I have been delighted to see that awareness on the part of a great many of those present of the social dangers of our new technology and the social obligations of those responsible for management to see that the new modalities are used for the benefit of man… the routes of good will are there, and I don’t feel as thoroughly pessimistic as I did at the time of publication of the first edition of this book” (Wiener, 1954, p. 162). (Boldface introduced by the authors)

This hope was expressed in 1954. Events of the following five decades, including the ethical behaviour of the industrial and financial elite of the developed world (which has ushered in the information age), have not vindicated the hope expressed by Wiener; rather, they have reinforced the apprehension expressed earlier that ‘there is nothing in the industrial tradition… before the crash touches him personally’.

As an aside, we mention that in the mid-1950s Wiener visited the Indian Statistical Institute at Kolkata (then Calcutta) and stayed there for some time. During this period he came across the writings of Mahatma Gandhi. To the best of our knowledge, Wiener did not write anything about this experience. But let us construct, as a mental exercise, the way Wiener, after becoming familiar with the thoughts of Gandhi (see Parel, 1997), would have engaged students of technology in an ethical discourse in the information age. Though difficult, we have attempted this exercise using the ethical codes of the IEEE and the ACM as pegs. The results will be elaborated in the paper to be presented. The outline of the paper is as follows.

Initially, the basic differences between the world views of Wiener and Gandhi are noted. These differences arise because their upbringing and background were different. Wiener was brought up in a developed country and was engaged in advancing the frontiers of science and technology; he had amply succeeded in this task and occupied a commanding position. Gandhi was born in India. He was a lawyer by profession who was primarily engaged in a struggle against the oppression of the downtrodden by a colonial power. Struggles against oppression have been waged many times in history, but the uniqueness of Mahatma Gandhi’s method lies in its reliance on the development of moral force by the oppressed themselves and on the use of only non-violent methods of protest (satyagraha). The aim was to change the heart of the oppressor with love. This method succeeded in obtaining political freedom for India (from British rule) in 1947. But the irony is that by the time Wiener came to India in the mid-1950s, the teachings of Mahatma Gandhi had already substantially weakened in India itself. The emphasis had shifted from the development of moral force to the pursuit of material progress. The genius of Wiener would have noticed this irony and commented on it. What would that commentary have been? An attempt has been made to construct it. This material can be used to engage students of technology in an ethical discourse so that they can handle the forward, backward and sideways uses of ICT.

REFERENCES

1. Bynum, Terrell Ward (2008), ‘Computer and Information Ethics’, The Stanford Encyclopedia of Philosophy (Winter 2008 Edition), Edward N. Zalta (ed.).

2. Introna, Lucas D. (2002), ‘The (im)possibility of ethics in the information age’, Information and Organization, 12, 71-84, Pergamon Press, Elsevier Science.

3. Wiener, Norbert (1947), ‘A scientist rebels!’, The Atlantic Monthly, January 1947, Vol. 179, p. 46, USA.

4. Wiener, Norbert (1950, 1954), The Human Use of Human Beings: Cybernetics and Society, Da Capo Press, Boston, USA.

5. Parel, Anthony J. (Ed.) (1997, 2007), Gandhi: Hind Swaraj and Other Writings, Cambridge University Press, New Delhi, India.