The Shaping of Human Cognition by Information Technology

AUTHOR
John Weckert and Craig McDonald

ABSTRACT

There is much discussion about computer-mediated communication but less about computer-mediated perception and its effects on cognition, concepts and language. In this paper we focus on the relationship between information technology and how humans think, perceive and behave. An understanding of this relationship underpins our capacity to use technology ethically, or to avoid it.

Human experience of the world is the internalising of information, mediated by our conceptual structures, sometimes called our beliefs, opinions, views or values. These structures are the fabric of our thinking selves, the framework within which our perceptions and actions are constructed. They are not static; unlike programs in some sort of cerebral computing unit, they are constantly being built and modified for us by our experience.

But increasingly we work in a designed information environment where the characteristics of the computing technology are imprinted on the information and, through our experience of it, imprinted on us. The very ways we think and perceive the world are being affected by information technology. It is possible that our lives are becoming easier in some ways because of information technology, but also poorer as we lose touch with the non-virtual world and with more direct ways of perceiving that world. In losing skills, including skills of perception, to technology, are we losing important human capacities too?

There is a vast literature which supports the view that language, as well as being a vehicle for communication, also influences how individuals experience the world (for example, see seminal works by Whorf, Wittgenstein, Kuhn and Quine). The basic idea is that words are more than labels which are attached to individual bits of the world. The view that words are just labels has been called the ‘myth of the museum’ (Quine). Language is closely linked with an individual’s conceptual structure. If we know the meaning of a word or phrase, or its use, then we have a certain concept or concepts. Commonly, learning a new word or term involves acquiring a new concept and, in some cases, learning to perceive the world slightly differently. Suppose that we are tasting wine with an expert. She talks about the acidity, astringency, oak flavours, and the like, of the various wines. She tells us which ones will improve with age, and which should be consumed now. Some of the features which she points out we can recognise, but others we cannot. In many cases we cannot taste or smell what she does. It is not merely a matter of our not knowing the relevant terms, nor of not knowing which attributes indicate ageing potential. Often, even when a feature is explained to us, we are still not able to taste or smell it. Again, with time and help we may acquire this ability, but we certainly do not have it at the moment.

We suggest that information technology, considered broadly to include devices containing processing chips, which now pervades much of our lives, influences our cognitive abilities at least partly through mediating our perceptions of the world and partly through pre-processing. There is nothing wrong with a photographer using light meters, provided that the skill of assessing light conditions without their use is not lost. The possession of that skill makes for a richer experience of the world. Use instruments rather than sight and touch to assess wool quality, but do not lose the ability to do it through visual and tactile experience. Use automatic pilots, but do not allow human pilots to become mere instrument readers. There are richer experiences in life than the reading of gauges. Use computers to assist in the administration of courses, but do not forget that knowledge is too rich to be placed into boxes with obscure codes. Modern information technology is hastening the process of seeing everything in life in terms of the simply measurable. Measurement is useful, but it concerns only a small part of human existence.

Information technology, and the language embedded in it, increasingly stands between our senses and the physical world in which we live. The technology pre-processes data from the world, imposing on it the conceptual structures of the technology developer. This is in marked contrast to the information technology of previous ages. The telescope, for example, enhances the work of the eye, but electronic sensors and systems (for example, those that report an enemy missile launch) have embedded in them the concepts of the technology developer and sponsor. The ethical capacity of one acting on information pre-processed by technology is thereby reduced.

The multi-media modes of information technology interfere with our ethical capacity in a different manner. Rather than presenting processed measures of the world, these technologies modify sensory input itself. Virtual reality, augmented reality, enhanced reality and attenuated reality modes can be a more subtle intervention between our senses and the world.

A detailed exploration of the relationship between human cognition and information technology is necessary for a critical assessment of the ethical and social impacts of information technology use. It may be unwise to accept some technology unthinkingly simply because it makes various tasks easier. We might be losing skills that are valuable for our quality of life. At least we should be aware of the social and personal costs involved.

The Ethics of the Hacker Taggers: The New Generation of Hackers

AUTHOR
Matthew Warren

ABSTRACT

The term ‘hacker’ has two common interpretations: in its original sense it denotes a person capable of creating elegant, unusual and unexpected uses of technology; in its more recent sense it denotes a person who attempts to penetrate the security of computer systems. In the contemporary world, the latter interpretation is by far the more common (although persons belonging to the former category of hacker would seek to more accurately define the latter group, particularly those with a malicious intent, as ‘crackers’). Hackers are by no means a new threat and have routinely featured in news stories during the last two decades. Indeed, they have become the traditional ‘target’ of the media, with the standard approach being to present the image of either a ‘teenage whiz kid’ or an insidious threat. In reality, it can be argued that there are different degrees of the problem.

Donn Parker (Parker, 1976) highlighted that the individuals involved in computer crime in the 1960s and 1970s were employed as key punch operators or clerks in EDP organisations, and that their crimes were crimes of opportunity. In the 1980s, with the development of cheaper home microcomputers and modems, a new generation of younger computer users emerged. One of the features of this younger group was a keen interest in the technologies, which led to the development of hackers.

Steven Levy’s book Hackers: Heroes of the Computer Revolution (Levy, 1984) suggests that hackers operate by a code of ethics. This code covers six key areas:

  • Hands-On Imperative: Access to computers and hardware should be complete and total. It is asserted to be a categorical imperative to remove any barriers between people and the use and understanding of any technology, no matter how large, complex, dangerous, labyrinthine, proprietary, or powerful.
  • “Information Wants to Be Free”: can be interpreted in a number of ways. Free might mean without restrictions (freedom of movement = no censorship), without control (freedom of change/evolution = no ownership or authorship, no intellectual property), or without monetary value (no cost).
  • Mistrust Authority: Promote decentralisation. This element of the ethic shows its strong anarchistic, individualistic, and libertarian nature. Hackers have shown distrust toward large institutions, including, but not limited to, the State, corporations, and computer administrative bureaucracies.
  • No Bogus Criteria: Hackers should be judged by their hacking, not by ‘bogus criteria’ such as race, age, sex, or position.
  • “You can create truth and beauty on a computer.” Hacking is equated with artistry and creativity. Furthermore, this element of the ethos raises it to the level of philosophy.
  • Computers can change your life for the better. In some ways, this last statement is simply a corollary of the previous one, since most of humanity desires things that are good, true, and/or beautiful.

During the 1980s and 1990s this pure vision of what hackers are was changed by the development of new groups with various aims and values. Mizrach (1997) states that the following individuals currently exist in cyberspace:

  • Hackers (Crackers, system intruders) – These are people who attempt to penetrate security systems on remote computers. This is the new sense of the term, whereas the old sense of the term simply referred to a person who was capable of creating hacks, or elegant, unusual, and unexpected uses of technology.
  • Phreaks (Phone Phreakers, Blue Boxers) – These are people who attempt to use technology to explore and/or control the telephone system.
  • Virus writers (also, creators of Trojans, worms, logic bombs) – These are people who write code which attempts to a) reproduce itself on other systems without authorisation and b) often has a side effect, whether that be to display a message, play a prank, or destroy a hard drive.
  • Pirates – Originally, this involved breaking copy protection on software. This activity was called ‘cracking’. Nowadays, few software vendors use copy protection, but there are still various minor measures used to prevent the unauthorised duplication of software. Pirates devote themselves to thwarting these and sharing commercial software freely.
  • Cypherpunks (cryptoanarchists) – Cypherpunks freely distribute the tools and methods for making use of strong encryption, which is basically unbreakable except by massive supercomputers. Because American intelligence and law enforcement agencies, such as the NSA and FBI, cannot break strong encryption, programs that employ it are classified as munitions. Thus, distribution of algorithms that make use of it is a felony.
  • Anarchists – These are people committed to distributing illegal (or at least morally suspect) information, including, but not limited to, data on bomb making, lock picking, pornography, drug manufacturing, and radio, cable and satellite TV piracy.
  • Cyberpunks – usually some combination of the above, plus an interest in technological self-modification, science fiction, hardware hacking and ‘street tech’.

But what about the hacking situation in the 2000s? Professor Warren will discuss a new hacking sub-group that now exists: the “Hacker Taggers”. These hackers deface websites with the sole intention of leaving a “tag”, or calling card, behind. Each tag counts towards a score, and scores are tallied in an ongoing hacking competition; for these hackers, hacking is a competition to be won. The media has often misreported these activities as mass hacking or cyber terrorism. The presentation will focus upon the ethical views of this new hacking sub-group, the impact that its members have caused, and the particular issues that the sub-group poses.

REFERENCES

Levy, S. (1984). Hackers: Heroes of the Computer Revolution. Penguin. ISBN 0385312105.

Mizrach, S. (1997). Is there a Hacker Ethic for 90s Hackers? URL:

Parker, D. (1976). Crime by Computer. Charles Scribner’s Sons, New York.

Freedom, Intellectual Property, and the Flow of Information

AUTHOR
Richard Volkman

ABSTRACT

Freeculture.org asserts, “A free culture is one where all members are free to participate in its transmission and evolution, without artificial limits on who can participate or in what way.” I argue that this worthwhile goal is inconsistent with the wholesale rejection of intellectual property. While current legal justifications and implementations of intellectual property are antagonistic to free culture, an examination of the general purpose and justification of property—especially drawing from the work of John Locke and F.A. Hayek—reveals that one cannot enjoy a fully free culture without legal protections that permit authors to set the terms for the distribution and reproduction of their works.

To understand why intellectual property is a necessary condition of free culture, one must understand markets as more than merely economic institutions. Despite the explicitly consequentialist justifications usually trotted out for intellectual property, our intuitions about property are not all accounted for in narrowly economic terms, and economic terms are never the end of the story in any case. As Hayek aptly notes, “Economic considerations are merely those by which we reconcile and adjust our different purposes, none of which, in the last resort, are economic.” To get to the bottom of property, we need to investigate the relationship between property and the projects and purposes that constitute our very lives.

These considerations indicate the value of establishing institutions that provide individuals with the incentives and the means to access, process, and respond to information about the best ways to achieve their various ends. In a nutshell, the assignment of property rights is necessary for a flourishing market, and a market provides the institutional means for leveraging the massive amounts of implicit, tacit, and distributed information necessary for discovering and living the good life. In this light, the purpose of property rights is to set up and regulate the market as an information-processing system, one that processes and distributes the information relevant to our ability to “reconcile and adjust our different purposes.”

In light of this, the critique of intellectual property typically errs by only addressing narrowly economic concerns. Against standard utilitarian defenses of intellectual property, it is often pointed out that information artifacts are non-rival goods and their distribution costs have become negligible. For example, since your having a copy of my song in no way diminishes my ability to use that song, and since nowadays there is little or no need to provide incentives to middle-men to distribute or produce the song, there are no overall negative consequences to doing away with the current financial incentives offered by intellectual property. Assuming that artists do not require strong economic incentives to do their best work, providing consumers with the legal right to make unauthorized copies does no harm to the art and no injustice to artists. In the information age, there is no need to create artificial incentives for distributors either. If the incentives of markets ever made sense, the information age has rendered them superfluous or even detrimental. After all, attention to the marketability of a work can negatively impact the work.

However, it is precisely the impact of the market on the work itself that means a fully free culture needs intellectual property. This is obscured in the critique, which narrowly addresses itself to economic incentives, missing the much more important role that markets play in gathering, processing, and distributing information in a readily usable and unequivocal form. In this sense, a market is an information resource for artists and patrons alike. The sort of market created by the assignment of intellectual property rights does a tolerably good job of gathering, processing, and making available usable feedback from an artist’s target audience. While alternative mechanisms may have a role to play, there is an undeniable advantage to a system that requires the would-be critic to “put your money where your mouth is.” Moreover, it should be emphasized that, since the relevant sense of a market is not limited to the economic buying and selling of artifacts, but extends to evaluating ways of life through all sorts of distributed institutions for distilling the “wisdom of crowds,” the relevant feedback is by no means crassly or narrowly economic. Whatever market or market-like mechanisms one chooses for feedback, I will show that they can only operate on the assumption of various intellectual property rights—though unpacking those rights also vindicates the need for significant reform.

Surely there will be artists who prefer to eschew any such feedback device, even while others embrace it, but that is exactly the point. Creators need the freedom to judge for themselves the best mechanisms of evaluation. A Lockean account of intellectual property rights is uniquely suited to defending such freedom, permitting and even encouraging alternatives to economic markets alongside other institutions. If music that is “free as in beer” really is as good as or better than music owned and distributed under a reformed intellectual property regime, then we need a market to provide the feedback that proves this. Just as a Lockean account is consistent with advocacy of Open Source software, along with the recognition that some software is better when it is proprietary (a point repeatedly made by Open Source advocate Eric Raymond), so too is a Lockean account consistent with allowing artists maximum flexibility to determine the appropriate feedback and distribution networks for their creations. While intellectual property protections may have restricted artists’ choices in the broadcast age, in the information age it is more restrictive to eliminate such protections for those who desire them. Surely it would be a grave mistake to trade away, in the interests of a “free culture,” the very institutions that facilitate the freedom of artists to create as they see fit and under conditions of their own choosing.

Network Neutrality: Ethical Issues in the Internet Era

AUTHOR
Matteo Turilli, Antonino Vaccaro and Luciano Floridi

ABSTRACT

The paper investigates the ethical nature of network neutrality in relation to its application to the Internet. Three main questions are addressed:

  1. What is the ethical nature of Internet neutrality?
  2. Should Internet neutrality be endorsed when considering its ethical implications?
  3. What ethical framework should be endorsed for regulating Internet traffic?

Specifically, we argue that network neutrality is not an ethical principle per se and that it does not directly enable or substantiate ethical principles. Consequently, network neutrality should not be considered dogmatically, as done in previous literature (e.g. Goldsmith and Wu, 2006; Wu, 2005), but rather should be evaluated pragmatically when applied to the Internet. An analysis of the parameters used to evaluate the quality of Internet services from the user’s perspective uncovers how, in many cases, implementing Internet neutrality may breach ethical principles. We therefore argue that a set of coordinated policies, rather than a neutral approach, would be preferable for regulating Internet traffic. We propose an ecological ethical framework, one that balances competing interests and considers the effects of stakeholders’ actions on one another, in order to avoid the potential unethical consequences of Internet traffic regulation.

The paper is organised into four sections. The first critically analyses the concept of network neutrality. Following previous literature (e.g. Wu, 2005; Yoo, 2005), a network is qualified as neutral if and only if all transactions are performed under the same set of criteria. For example, the United Parcel Service (UPS) network can be said to be neutral only if packages are sent to and from any user by applying the same criteria expressed in terms of priority, checking-in and checking-out procedures, carrier typology or pricing. Such a definition does not depend on the topology of the network or the technology involved. It is therefore readily applicable to all networks, including the Internet. Clients, servers and routing-related devices coordinate and perform transactions of data packets through the Internet. Such transactions are regulated by communication protocols that determine the route of a data packet depending on its destination and properties. The Internet is neutral if and only if all data packets are transmitted with the same priority, irrespective of their properties.
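
This iff-definition lends itself to a compact illustration. The following minimal sketch (our own, not part of the paper; the Packet fields and criterion functions are invented for the example) models a handling criterion as a function from a packet’s properties to a priority class, and tests neutrality as the requirement that the criterion treats every packet identically:

    from dataclasses import dataclass
    from typing import Callable, Iterable

    @dataclass
    class Packet:
        source: str
        destination: str
        service: str  # e.g. "voip", "p2p", "mail" (invented categories)
        size: int     # bytes

    # A handling criterion maps a packet's properties to a priority class.
    Criterion = Callable[[Packet], int]

    def neutral_criterion(packet: Packet) -> int:
        """Neutral handling: every packet receives the same priority."""
        return 0

    def service_criterion(packet: Packet) -> int:
        """Non-neutral handling: latency-sensitive services go first."""
        return 0 if packet.service == "voip" else 1

    def is_neutral(packets: Iterable[Packet], criterion: Criterion) -> bool:
        """True iff the criterion assigns all packets the same priority,
        irrespective of their properties (the definition above)."""
        return len({criterion(p) for p in packets}) <= 1

On any mixed sample of traffic, is_neutral returns True for neutral_criterion and False for service_criterion, which is exactly the distinction the definition draws.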

The second section of the paper investigates the ethical implications of implementing network neutrality on the Internet. The ethical myth of Internet neutrality is debunked by showing how a neutral Internet can be used unethically, offering unfair services to a variety of stakeholders. It is argued that a neutral Internet does not allow for a quality-based prioritisation of traffic and, as such, impairs some services and their users while unfairly favouring others. For example, in a condition of neutrality, P2P services can increase the latency of real-time, interactive services such as VoIP or remote shell connections. In such contexts, Internet neutrality supports unfair and discriminatory usage of resources based on uncontrolled competition. We therefore suggest that, in order to avoid such negative consequences, Internet neutrality should be dropped in favour of a controlled, service-oriented prioritisation of traffic.
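
The latency point admits a toy queuing calculation (our own illustration, with invented numbers). Under neutral first-come-first-served scheduling, a small VoIP packet queued behind bulk P2P transfers must wait for them all; a service-oriented scheduler of the kind suggested here would transmit it first:

    LINK_RATE = 1_000  # link speed in bytes per millisecond (invented figure)

    # A router queue: two bulk P2P packets ahead of one small VoIP packet.
    queue = [("p2p", 50_000), ("p2p", 50_000), ("voip", 200)]

    def voip_latency(packets, prioritise_voip=False):
        """Milliseconds until the VoIP packet has been fully transmitted."""
        if prioritise_voip:
            # Service-oriented scheduling: latency-sensitive traffic first.
            packets = sorted(packets, key=lambda p: p[0] != "voip")
        elapsed = 0.0
        for service, size in packets:  # neutral case: strict FIFO order
            elapsed += size / LINK_RATE
            if service == "voip":
                return elapsed

    print(voip_latency(queue))                        # 100.2 ms under neutrality
    print(voip_latency(queue, prioritise_voip=True))  # 0.2 ms when prioritised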

The third section shows that, while a neutral Internet does not guarantee an ethical service and still allows for unethical practices, regulative policies raise important ethical issues if they are not devised in the context of an appropriate ethical framework. In particular, the discussion leads to the analysis of four main problems associated with the development of a regulative Internet policy: (1) appropriation of the communication resources in favour of a pay-per-transaction model; (2) deterioration of the standard quality of service; (3) packet routing discrimination; and (4) unfair intra-service competition. These consequences would promote unethical conditions for Internet services, such as unfairness, discrimination, a digital divide or a lack of freedom for developers and users.

The fourth and final part of the paper analyses how the adoption of an ecological-ethical framework during the phase of policy-making would minimise unethical consequences in Internet regulation. An ecological approach requires one to consider the interplay between the interests of all the parties involved in the use, maintenance, development and monitoring of the Internet. The goal is to guarantee and balance different interests against each other, carefully evaluating the effects of each choice on all the stakeholders.

This approach has been successfully deployed in many existing networks (for example, private national highway systems, private national power grids, world-wide transportation infrastructures), and we provide a detailed description. We then argue that the separation and ‘ecological’ regulation of different Internet services offered, in addition to a shared and possibly nationally controlled communication infrastructure, would make it possible to control unethical practices and optimise available resources.

The paper ends with two explanatory examples. They point out how Internet regulation, as opposed to the neutrality paradigm, can be leveraged in order to guarantee both ethical requirements and more efficient communications and services. The first example shows how to implement a policy to avoid spam by introducing subscription-based mail services modelled on physical mail delivery systems. The second example illustrates how a resource redistribution schema could guarantee a standard quality of service comparable to the one enjoyed today by Internet users.

REFERENCES

Goldsmith, J., Wu, T. 2006. Who Controls the Internet? Illusions of a Borderless World, Oxford University Press, Oxford.

Wu, T. 2005. Network Neutrality, Broadband Discrimination, Journal of Telecommunications and High Technology Law 2, 141-78.

Yoo, C. S. 2005. Beyond Network Neutrality. Harvard Journal of Law and Technology, 19(1), 1-24.

Toward a Global Information Ethics: Some Confucian and Aristotelian Considerations

AUTHOR
Jin Tong

ABSTRACT

Information technology is changing the world, and cyberspace crosses borders between countries and cultures. A number of ethical issues are raised by the border-crossing nature of cyberspace. To deal with this ethical challenge, new policies (including, perhaps, new laws) are needed. The necessary cross-border policies should be based upon a “global”, cross-cultural ethics, and recent computer-ethics research regarding “global information ethics” can be helpful. Because human beings share a common human nature, our understanding of human autonomy, and its dependence on the acquisition and processing of information, provides a good starting point for research on global information ethics. The present paper focuses upon the examples of Confucian ethics and Aristotelian ethics in the search for a global information ethics. The paper is part of a larger project to identify and explore a common ethical foundation, based upon human nature, for all the great ethical traditions around the globe, both East and West.

Confucianism presupposes that all human beings are similar in nature. The Confucian thinker, Mencius, for example, emphasized that all humans are potentially good because, by nature, they all have four “seeds” or “roots” of moral virtue. These common roots give humans the potential to (1) set their will to become virtuous, (2) train their emotions, (3) engage in appropriate reflection and thinking, and (4) engage in appropriate actions. The Confucian thinker Xunzi also recognized the potential of all humans to become virtuous, but he focused instead upon his concern that humans start with an animal nature that can make them evil unless they use the power of their will and appropriate education, habituation and ritualization to become good.

Aristotle’s account of human nature and human virtue has much in common with Confucianism. For example, Aristotle also believed that humans share a common nature that gives them the potential to be virtuous. In addition, Aristotle also assumed that humans are born with an animal nature (he defined man as “the rational animal”) that can lead to evil, unless a person is properly educated, habituated and enlightened.

The common underpinning of Confucian and Aristotelian ethics is the understanding of human beings as autonomous agents, taking responsibility for their own actions and thereby determining whether or not they will be virtuous citizens. In this information age, a global ethics based upon this common understanding of moral excellence can, perhaps, be the foundation of a global information ethics that enables a worldwide ethical conversation on the Internet among all the cultures of the globe.

IT Professional: Working Beyond Technology

AUTHOR
J. Barrie Thompson

ABSTRACT

As stated in the call for papers, “The information revolution has become a tidal wave that threatens to engulf and change all that humans value. Governments, organisations and individual citizens therefore would make a grave mistake if they view the computer revolution as merely technological. It is fundamentally social and ethical.” It is also made clear in the call that IT is a facilitator of social interaction, human endeavour and environmental wellbeing. However, to consider the technology alone is insufficient. It is essential that we consider both those who are developing the underlying technology itself and those who are using the technology in developing and supporting the IT systems on which so much of the world depends. It is people who are important, and they must work beyond the technology, for as was highlighted in a recent high-profile report [1]:

“A striking proportion of project difficulties stem from people in both customer and supplier organisations failing to implement known best practice. This can be ascribed to the general absence of collective professionalism in the IT industry, as well as inadequacies in the education and training of customer and supplier staff at all levels.”

The failure of many software projects to meet their objectives, or indeed the termination of partially completed projects, is an all-too-frequent occurrence. The ongoing problem of poor-quality software has been repeatedly highlighted in published studies (e.g. [2]) and in major conference presentations (e.g. [3]). The cost of these failures is enormous: a recent article [4] reported that in the UK, between 2000 and 2007, the total cost of abandoned Central Government computer projects had reached almost two billion pounds. These ongoing problems have acted as a catalyst for national computing bodies to address professionalism in a proactive manner. In particular, the British Computer Society has, since 2005, undertaken an ambitious three-year managed programme [5] (named ProfIT) that has two key objectives:

  1. By increasing professionalism, to improve the ability of business and other organisations to exploit the potential of information technology effectively and consistently.
  2. To build an IT profession that is respected and valued by its stakeholders – government, business leaders, IT employers, IT users and customers – for the contribution that it makes to a more professional approach to the exploitation and application of IT.

The success of the BCS ProfIT effort can be judged from the fact that since January 2007 the International Federation for Information Processing (IFIP) has been working with the BCS and other professional bodies to develop an augmented international programme which has been named [6] the International Professional Practice Programme – I3P. The programme is intended to establish an international grouping to speak globally about issues relating to the profession and ensure that the voice of the ICT practitioner is clearly and powerfully expressed. There is also an aim to create a globally recognised accreditation, provisionally named the International IT Professional (IITP).

This paper will build on a paper published at ETHICOMP 2007 [7], which charted earlier global and national developments relating to professionalism in the ICT sector and examined the first 18 months of work supporting the ProfIT programme. The present paper will cover the completion of the ProfIT programme and chart the latest developments relating to IFIP’s International Professional Practice Programme and the related accreditation for the International IT Professional. It will also provide a critical appraisal of the likely effectiveness of these initiatives; finally, an evaluation will be presented to assess whether we are approaching a situation where IFIP’s definition of a professional, viz. one who will:

  • Publicly ascribe to a code of ethics published within the standard.
  • Be aware of and have access to a well-documented current body of knowledge relevant to the domain of practice.
  • Have a mastery of the body of knowledge at the baccalaureate level.
  • Have a minimum of the equivalent of two years supervised experience before the practitioner operates unsupervised.
  • Be familiar with current best practice and relevant proven methodologies.
  • Be able to provide evidence of their maintenance of competence.
    represents reality.

REFERENCES

[1] Royal Academy of Engineering, The Challenges of Complex IT Projects, Report of a working group from The Royal Academy of Engineering and The British Computer Society, 2004, available from: http://www.bcs.org/upload/pdf/complexity.pdf [accessed October 12, 2006].

[2] R. L. Glass, Facts and Fallacies of Software Engineering, Pearson Education, Boston, 2003.

[3] C. Hughes (2006), Professionalism in IT, Keynote Address, 19th IFIP World Computer Congress (WCC 2006), Santiago, Chile, August 20-25, 2006. Presentation available from:

[4] B. Johnson and D. Hencke, Not Fit For Purpose: £2bn Cost of Government’s IT Blunders, The Guardian, Saturday January 5, p. 11, 2008.

[5] BCS Professionalism in IT Programme, covered in a series of articles in the May 2006 issue of IT NOW, British Computer Society, Swindon, UK.

[6] C. Hughes (2007), International Professional Practice Programme – I3P, IFIP News, September 2007, p. 5, available from http://www.ifip.org

[7] Thompson, J. B. (2007), Globalisation and the IT Professional, 9th International ETHICOMP Conference, Meiji University, Tokyo, 27 to 29 March 2007, Proceedings pp. 564-575.