Eeny, Meeny, Miny, Masquerade! Advergames and Dutch Children: A Controversial Marketing Practice

Isolde Sprenkels and Dr. Irma van der Ploeg


In a society increasingly inundated with digital technology, children in the Netherlands learn from a very young age how to use new information and communication technologies (ICTs). These technologies offer them ways to play, learn, explore and develop their sense of identity, as well as interact and communicate with adults and peers. Children spend ever more time in front of computer and mobile screens, with gaming as one of their favourite activities. One type of game many children enjoy playing is the online casual or mini game. These short, ‘free’ and easy-to-learn games have friendly designs with bright colours and fun tasks to perform, and are developed to entertain, educate or deliver a particular commercial message. This paper focuses on the latter: the ‘advertisement as game’ developed around a particular brand or product, which can be described as an ‘advergame’.

Advergames are used by companies to build brand awareness, prolong contact time, stimulate product purchase and consumption, drive traffic to a brand’s website, generate consumer data and build and expand digital profiles of consumers. Especially when played by children, these advergames can be considered to be problematic and controversial, as they are seen to exploit children by taking advantage of their state of psycho-social development and by integrating unseen technological features. They bring together several issues related to identity, consumption, marketing, profiling and datamining. Using insights from surveillance studies, science and technology studies, (sociological) studies of identity construction in relation to ICTs, and studies on children and consumption, this paper will analyse several advergames targeted to Dutch children. It examines how this new form of marketing communication fits into corporate objectives and why this can be considered controversial with children.

First, advergames will be examined against a discourse that suggests it is immoral to economically exploit children; that children are considered vulnerable and it is inappropriate to take advantage of this vulnerability by using sophisticated marketing strategies on them. As children’s cognitive skills are not yet fully developed and they have little life experience, their ability to interpret and assess commercial messages is limited. This makes persuasive strategies unethical, as children are still learning to distinguish such messages and are unable to make choices that would protect them from certain forms of marketing manipulation (Moore 2004; see also Buijzen and Valkenburg 2003). Research has shown that children find it difficult to distinguish between advertising and editorial content in online environments (Nielsen 2002; Mijn Kind Online 2008). There is also an increasing lack of parental supervision of children’s use of the internet (Qrius 2007). This implies that many children are on their own when it comes to identifying commercial content online and developing digital information skills. Codes of conduct such as the Dutch Advertising Code prescribe that the distinction between advertising and editorial content should always be made recognizable. When it comes to advergames, however, this distinction is not made explicit in any way, making it very difficult for children to discriminate between advertisement and entertainment in these ‘seamless environments’ (Moore 2004).

Arguably, this is part of a marketing strategy. Eliminating the recognition or identification of the commercial message and of marketing practitioners’ intentions and tactics fits the ‘kidsmarketing’ strategy of tailoring messages and designing products, packages, websites and advertisements in a way that appeals to children’s ‘wants and needs’ and is identifiable to them, with ‘play and fun’ at its core (Cook 2010). Advergames appear to be the ultimate form of this ‘play and fun’ approach: a ‘masquerade’, where marketing practitioners hide behind a screen full of play and fun while reaching their own commercial goals in the meantime. More specifically, while advergames may be seen as an opportunity to play something fun for free, children remain unaware of the commercial intent and manipulation behind the (adver)game, which can be seen to mediate and even transform their play, their sense of self and their understanding of the world around them. Not only are they offered what they ‘want and need’ from the viewpoint of the marketing practitioner; what they ‘want and need’ appears to be produced by this very same strategy.

Second, the paper will take into account certain features that are designed into advergames in order to reach corporate goals such as building brand awareness, stimulating consumption, and generating consumer data. A study on children and advergames shows that many of these games include features to encourage children to repeat play and purchase products, offering such things as multiple game levels, public displays of high scores and game tips within product packages (Moore 2006). Another study indicated a relationship between the capacity of the advergame to induce a state of flow, a mental state of subjective absorption within an activity, and a change in the buying behaviour of (in this specific case adult) players (Gurau 2008). Advergame research also shows how some of these games include product-related polls or quizzes, offering valuable information for market research on children’s habits and preferences (Moore 2006; Grimes 2008). They may also encourage players to register and share their gaming experience with friends or family, collecting personally identifiable information (Gurau 2008). Combined with an analysis of in-game behaviour and activities, marketers are able to construct detailed consumer profiles based on the aggregation of these behavioural and demographic data (Grimes 2008; Chung & Grimes 2005). Through this, advergames can be described as ‘electronic surveillance devices’, as they enable a new form of tracking children’s activities. In addition, studies on online communities for children and advertising discuss marketers using immersive advertising campaigns such as advergames, encouraging children to play with particular products, enabling them at a later point in time to identify the brand (Grimes & Shade 2005), and creating a ‘personal relationship’ with the product (Steeves 2006). Such campaigns teach children to trust brands and consider them their friends, with brands not only recommending products but becoming ‘role models for the child to emulate, in effect embedding the product right into a child’s identity’ (Steeves 2006).


Chung, G. & Grimes, S. (2005) ‘Data Mining the Kids: Surveillance and Market Research Strategies in Children’s Online Games’, Canadian Journal of Communication, vol. 30, no.4, pp. 527-548.

Lessons Learnt from the Past: Reflections on Working for Families Projects in Scotland on Ethnic Minorities

Nidhi Sharma and Shalini Kesar


Keeping in mind lessons learnt from the previous research presented at the last two ETHICOMP conferences, this paper reflects on the most current project (Working for Families Project: 2009-2012), funded by European Social Funds and the Dundee Partnership. This is phase III of on-going research. Recognizing the success of the previous Working for Families Project (WfFP), reflected in phases I and II, a new WfFP was initiated that focuses on various issues in the context of reducing the employability gap currently existing in Scotland. The results of the WfFP reflected in this paper (phase III) focus on ethnic minorities in Dundee. The main goal of this project is to train people from ethnic minority groups to enhance their basic skills, including Information and Communication Technologies (ICT), literacy and numeracy.

In doing so, this paper draws on two groups of interviews. Group I included the same set of women who are currently working or enrolled in higher education after receiving training/services from the earlier WfFP (phases I and II), whereas Group II included a new set of people from ethnic minorities who are currently enrolled in this project. This was done for two reasons. Firstly, we would be able to better identify, and thus compare and contrast, their barriers from an employment point of view. Secondly, feedback from Group I will help us further modify the existing training and service delivery to better suit the needs of Group II in trying to obtain employment. This is important, as funding for the following years depends on the success of this project. In other words, funding depends on the number of people from ethnic minorities who actually obtain employment or go into higher education. This is monitored by the government and similar funding authorities.

The Working for Families Project was initiated in the early 2000s by the Scottish Executive, with the goal of supporting vulnerable or disadvantaged parents towards, into or within employment by breaking down childcare and other barriers. It underpins the Scottish Government’s commitment to tackling child poverty. WfFP also aims to tackle additional employability barriers such as low skills, lack of confidence, transport, debts, substance misuse issues, and other care responsibilities. The target groups for the initiative were: lone parents; ethnic minorities; and parents with other stresses in the household which make it difficult to sustain employment (for example, disability, mental health, family break-up, and drug and alcohol problems). The main services offered were:

  • Employability Support Team – deals directly with many clients and signposts them to an appropriate Link Worker or to specialist help
  • Link Workers – central to WfFP, with roles as recruiters and providers of guidance and advice, signposting clients to relevant employment, education and training opportunities
  • Money Advice Support – provides a range of services, including benefit checks and better-off calculations
  • Access to Childcare – WfFP staff can assist clients in finding suitable childcare to enable access to work, education or training
  • Training & Education – WfFP provides a range of opportunities to improve skills and employability
  • ICT Training
  • Dundee College – provides a range of career-focused taster courses
  • Financial Assistance – many WfFP clients are eligible for assistance from one of the WfFP client funds
  • Childcare Subsidy Fund – provides assistance to clients who are starting work and need help
  • Barrier Free Fund – can help clients with non-childcare-related expenses

Although the main objectives of Phase III (the current project) are the same as those of the previous WfFP phases, the main difference is that tools and techniques are being modified in keeping with the findings of phases I and II of this research. Kolb’s cycle is used as a way to reflect on, and hence outline, lessons learnt from the different phases of this project. The table below summarizes the findings of this paper so far.

Engineering Ethics for an Integrated Online Teaching: What is Missing?

Montse Serra, Josep M. Basart and Eugènia Santamaria


Engineering graduates are —and will be— facing increasingly complex ethical and social issues in their work. Certainly, laws, professional regulations and codes of ethics can help when addressing this challenge, but the utility of these policies and resources depends on whether these future professionals understand where and how to take them into account. Accordingly, a well-founded education in professional ethics is required for future engineers. Nevertheless, in spite of the expectations and demands of an ever-changing society, the incorporation of courses on ethics into engineering curricula is often a concession rather than a common academic requirement.

Thus, any concerned educational approach needs to make the case for ethics, showing how engineers can carry out their work in an ethically and socially responsible way, since it is apparent that ethical issues are inherent to their profession (Huff and Frey, 2005).

Designing an effective introduction to ethics within the academic curriculum is more difficult than teachers may imagine or admit, particularly where undergraduate students are concerned. From our point of view, several constraints and resistances are present that deserve special attention:

  • As our society becomes more and more dependent on technology, the engineer’s role is accentuated and his or her responsibilities amplified (Pritchard, 1998). Engineering instructors thus find it difficult to know how to weave applied ethics into a curriculum already full of technical subjects, all of which are considered intrinsic to the course.
  • Spreading ethics across the curriculum calls for the contribution of experts versed in the relevant areas of the technical or engineering sciences as well as experts from the humanities and social sciences, in order to achieve the expected goals. This collaboration is not always welcomed by either side and is never straightforward.
  • Some doubts and objections exist among the teaching staff about whether ethics can be taught at all, let alone to grown-up people who are supposed to know the difference between right and wrong.
  • Under the influence of both their social environment and the one they find in technical schools themselves, engineering students often think that ethical contents are not really relevant to their own field of study (Fleischmann, 2006).
  • Finally, the frequent clash between students’ scepticism towards learning ethics and teachers’ conviction of its advisability calls for constant weighing up and adaptation of which contents to teach, which methodology to apply, which educational and technological resources to use, and which teaching staff to involve.

Carrying out a discipline such as engineering ethics within an online environment brings further constraints that are endemic to this context, and these special characteristics must be considered when developing any learning process. Teaching within an online environment (Rodríguez, Serra, Cabot and Guitart, 2006) is a social process which requires a specific setting, involving technological platforms and methodological tools, in order to facilitate online interaction: discussing ideas, practising behaviours, and developing attitudes and skills so as, finally, to promote experiential and active learning (Sieber, 2005). In the case of engineering ethics these goals challenge educators to focus on real-world problems and practical solutions, requirements that are not easy to meet within an online learning context (Demiray and Sharma, 2009).

Within this framework what is needed, therefore, is an examination of the teaching methodology and its performance in practice when ethical subjects are considered. Our proposal here is to show how learning tools such as dialogue (Serra and Basart, 2010), moral reasoning and judgemental language work, and how they are reshaped in this new environment. This involves analysing the essential requirements of these communication tools (i.e., genuine listening, attention in a virtual context, non-conditioned thinking, and an open mind). Additionally, solving moral conflicts requires appropriate strategies, so a heuristic analysis will also be considered, taking into account the above-mentioned learning tools. Finally, as an integrating element, we show how interaction develops along the learning process, inside a social network, by means of the previous tools.

It is important to emphasize that, thanks to these communication tools, the network communities created within an online context learn as a group, constructing knowledge collectively and contributing the tacit knowledge (Bohm, 1996) of the community in which their members participate.


Bohm, D. “On Dialogue”. Edited by Lee Nichol. Routledge, London, 1996.

Demiray, U. and Sharma R.C. “Ethical Practices and Implications in Distance Learning”. Information Science Reference. Hershey, New York, 2009.

Fleischmann, S.T. Teaching Ethics: More Than an Honor Code. Science and Engineering Ethics, 12, 381–389, 2006.

Huff, C. and Frey, W. Moral Pedagogy and Practical Ethics. Science and Engineering Ethics, 11, 389–408, 2005.

Pritchard, M. S. Professional responsibility: Focusing on the Exemplary. Science and Engineering Ethics, 4, 215–233, 1998.

Rodríguez, M.E., Serra M., Cabot J. and Guitart, I. “Evolution of the Teachers’ Roles and Figures in E-learning Environments”. The 6th IEEE International Conference on Advanced Learning Technologies (ICALT 2006). Proceedings of the 6th IEEE International Conference on Advanced Learning Technologies, IEEE Computer Society Press, 512–514. Kerkrade, The Netherlands, 2006.

Serra M. and Basart J.M. “A dialogical approach when learning engineering ethics in a virtual education frame”. Proceedings of Ethicomp 2010 – The “backwards, forwards and sideways” changes of ICT, 483–490. Universitat Rovira i Virgili, Tarragona, Spain, 2010.

Sieber, J.E. Misconceptions and Realities about Teaching Online. Science and Engineering Ethics, 11, 329–340, 2005.


Dr. Toni Samek and Dr. Ali Shiri


Contributions to information ethics occur between disciplines, across different disciplines (e.g., computer science, gender studies, law, business), and even beyond disciplines. Because information work is often political, it is important for educators to examine, explore, and teach a range of social responsibility and ethical implications as reflected in an increasingly intense information society. Looking through the specific lens of the North American library and information studies landscape, we can see that teaching and scholarship are heavily weighted towards techno-managerial curricular design and research. However, broadly in society, social responsibility, social justice, and global information justice movements blend people and concerns for the human condition into theories and practices of social computing applications and environments. Our contribution is a knowledge mapping of social responsibility in an information-intensive society, and the final product that we hope to share with ETHICOMP is a taxonomy.

Dr. Samek’s ongoing immersion and scholarship in human rights forms the basis for our taxonomic content. In her scholarship, she studies evidence of voices and other human traces that reflect contemporary local, national and transnational calls to action on conflicts generated by failures to acknowledge human rights, by struggles for recognition and representation, by social exclusion and by library and related cultural institutional roles in these conflicts. Through content analysis of human rights literature (including workbooks) she collates terms (e.g. protest, human security, survival) that she then tests out for matches in global library and information worker advocacy and activism. For example, for human rights terminology such as “revitalization” and “human security” she points to such activities as the Joint UNESCO, CoE and IFLA/FAIFE Kosova Library Mission. Dr. Shiri’s intellectual contribution draws on his sophisticated research in the development and evaluation of knowledge organization systems such as thesauri and taxonomies. Using facet and subject analyses, his work shapes the foundation for the design of the underlying framework and knowledge structure of our taxonomy.

Some knowledge organization systems have been developed for the analysis and documentation of human rights literature, such as Human Rights Thesaurus and Human Rights Documentation Classification. Our taxonomy is different from these kinds of tools in that it addresses and encompasses the information-focused themes and terms evidenced in global social responsibility initiatives and emergent social computing applications. Herein, our knowledge mapping aims to provide a deeper, more comprehensive, and intercultural snapshot of social media and social computing technologies within these broader contexts.

We propose ten high-level categories (e.g., communities, social computing applications, activities and operations) that reflect prevalent contemporary aspects of social responsibility in the information society. We also assign each of these ten categories a specific set of sub-facets and terms that reflect concrete actions, both physical and digital, perhaps most interestingly in the emergent realm of digital human connections and exchanges. And we situate this work in the trans-disciplinary communities of scholarship with a common interest in information ethics, social responsibility and computer ethics.

We hope that by introducing our taxonomy to the ETHICOMP community we can receive direct and diverse feedback to help us move forward in developing a more refined and inclusive iteration that can be used for the organization, sharing and searching of physical and digital information by multiple stakeholders in society. Below is a first-stage version of our taxonomy (though not in its complete form, for word-count purposes).
[Table 1: first-stage taxonomy]

What do we Take? What do we Keep? What do we Tell? Ethical Concerns in the Design of Inclusive Socially Connected Technology for Children

Janet C Read and Maija Fredrikson


Designing great computer systems requires attention to many things. In this paper, the focus is on the design of a mobile technology for children aimed at providing an inclusive approach to music making, one that would enable children who might otherwise be excluded to feel more attached to those around them and to experience feelings of self-worth. The two stages of design considered in this paper are the involvement of children during early design work, and the design of security and alert systems in the interactive product. Both of these stages raised ethical dilemmas for which the project team had to find solutions.

Including children as participants in the design of their own technologies takes its inspiration from the early work on participatory design (Schuler and Namioka 1993) as well as from more recent work on children as design informants (Read, Gregory et al. 2002; Scaife, Rogers et al. 1997; Druin 1999). In a typical session of this kind, children are given some information about the problem being designed for and are then given activities that collectively gather ideas for features, for the look, and for the fun aspects of an eventual product (Theng, Nasir et al. 2000; Mazzone, Read et al. 2008). Several commentators have considered the value of these design sessions by examining the value to the children, to the development team and to the adult participants (Mazzone, Read et al. 2008). The ethical problems associated with this type of activity mainly centre on the extent to which the children understand their participation. It is quite possible that children do not fully understand what their ideas are being used for, what the overall project is about, or the extent to which their work will be used at all.

As a result of carrying out these sorts of activities within the UMSIC project, where the participatory activities were carried out both in the UK and in Finland, we have developed a protocol for ‘Honest Research’ with children. This protocol demands that children are kept fully in the research loop: they are given clear information at the beginning of a project that outlines why they are participating, given specific appropriate feedback from each individual design session that outlines what was taken from it, and able to see and critique all outputs from the design sessions, whether these be academic papers or interactive products. In carrying out this protocol the research team are seen to be more cautious about what they do, more attentive to detail in regard to what they say about the design sessions, and more respectful of the children’s views. In the UMSIC project, where possible, children have been shown the eventual product that was developed with their help.

Our second problem space in designing connected technologies for children concerns the use of passwords and security systems, and making what should be easy-to-use systems both secure and understandable. In many instances, users of computer technology are unaware that they are connected to other machines; they are also often unaware of what data is being taken from one place to another. It is clear in our work that children should be kept informed about whether or not they are connected to each other, about where their data may go, and about the possible dangers associated with their connectedness. It is also clear to us, however, that most children are rather unconcerned with security (Read and Beale 2009) and want it to be invisible, whereas the parents and guardians of these children, in determining what technologies their children may use, want to see security systems at work in order to ‘trust’ the product (Gefen, Karahanna et al. 2003). The more security that is put into the product, the more unusable, and unattractive, it might become to the children. This raises an ethical dilemma, as the design team want to design for both groups but are clearly most concerned with making the products usable for children.

In our work (Read and Beale 2009) we have designed a security system (Possibilities not Perils) in two layers, with one layer being the concern of the children and the other the concern of the adults. Children are shown icons that identify when they are connected to other children and are clearly told where their data is heading. Adults, on the other hand, have adult-style control systems that are shown to be robust and sturdy. It could be argued that it is the duty of a team making connected software for children to ‘educate’ children about the perils of being online and being in a shared data space. The view for our project is that this is not appropriate: the system needs to deal with the perils, and the children need to feel free to use the software. Security, we feel, is a system problem that needs to be shown to adults but not to children. The only use of passwords for children, in the child-facing product, is for loading user profiles that give a better user experience.


Druin, A. (1999). Cooperative inquiry: Developing new technologies for children with children. CHI99, ACM Press.

Gefen, D., E. Karahanna, et al. (2003). “Trust and TAM in online shopping: An Integrated Model.” Management Information Systems Quarterly.

Mazzone, E., J. C. Read, et al. (2008). Design with and for disaffected teenagers. Nordichi 2008, Lund, Sweden ACM Press.

Read, J. C. and R. Beale (2009). Under my pillow: designing security for children’s special things. DCS – HCI 2009, Cambridge, UK, ACM Press.

Read, J. C., P. Gregory, et al. (2002). An Investigation of Participatory Design with Children – Informant, Balanced and Facilitated Design. Interaction Design and Children, Eindhoven, Shaker Publishing.

Scaife, M., Y. Rogers, et al. (1997). Designing For or Designing With? Informant Design for Interactive Learning Environments. CHI ’97, Atlanta, ACM Press.

Schuler, D. and A. Namioka, Eds. (1993). Participatory Design: Principles and Practices. Hillsdale, NJ, Lawrence Erlbaum.

Theng, Y. L., N. M. Nasir, et al. (2000). Children as Design Partners and Testers for a Children’s Digital Library. ECDL2000, Springer Verlag.

Ethics and Emerging Technologies: Practitioners’ Perspectives

Mary Prior and Simon Rogerson



In his famous 1985 paper, James Moor proposed that the novelty of computer technology led to the existence of ‘policy vacuums’:

‘Computers provide us with new capabilities and these in turn give us new choices for action. Often, either no policies for conduct in these situations exist or existing policies seem inadequate.’ (Moor, 1985, p. 266)

To address this problem, a research project currently being undertaken within the European Commission 7th Framework Programme is focussed on identifying emerging Information and Communication Technologies (ICTs) and the ethical issues to which they may give rise, in order to recommend governance structures and policies aimed at addressing them before or as they arise (ETICA).

To complement the academic/research focus of ETICA, a project is being undertaken with ICT practitioners to identify their perceptions of emerging technologies, the ethical issues to which they may give rise and how they may be addressed. This paper will report the outcomes of this project, including a comparison between the perceptions of academics/researchers and of practitioners.

Research methods

The work is being undertaken on behalf of a professional body, with the co-operation of its more experienced members. Two research methods are being employed; firstly, a survey (questionnaire) (Bryman, 2008) and secondly, focus groups (Beardsworth & Bryman, 2006).

The ETICA project used a survey in its initial stages, aimed at researchers and helping to identify:

  • the fields within which current ICT research is being conducted;
  • application areas, expected use and the benefits of these technologies;
  • ethical, social and legal issues that were foreseen, how they were identified, how they had been addressed and how effective were any measures taken to address them;
  • the technologies likely to be used in the future, the ethical issues to which they might give rise and how they might best be addressed.

This survey was adapted for use with ICT practitioners. At the time of writing, responses have been received and analysed to identify fruitful areas for more in-depth discussion within the focus groups. The latter will comprise senior, experienced practitioners and will take place during the Spring of 2011.

Survey results

Respondents are working on a wide range of technologies, in a variety of industries. The ETICA project had identified 11 fields (e.g. affective/emotional computing; ambient intelligence; artificial intelligence; bioelectronics). Nearly half of respondents work in the field of ‘Cloud Computing’, with the next highest proportion in ‘Future Internet’. Altogether, 9 of the 11 identified fields are represented.

Only half of respondents say that ‘possible ethical, social or legal problems’ were foreseen as arising from the projects they were working on. Given the fields involved, and the expected benefits (many of which involved greater efficiency/cost savings and improved data management), this is an interesting finding that requires further investigation. The majority (nearly 80%) did not consider gender.

Of the possible ethical, social or legal problems identified, many were related to data protection, privacy and security. However others such as ‘reduced staff requirements’ and ‘intrusion into personal matters’ were also mentioned. Among the measures taken to address the issues, many were ‘technical’, although ‘including a work package on legal, ethical and social issues’, ‘reconsideration of the objective of the project’ and ‘setting up an ethics committee/review board’ were among steps taken, too. In one case, ‘cancellation of a part (or more) of the project’ was cited.

Among the future technologies identified, Cloud Computing figured prominently; hardly surprising given that many respondents were working in this field. Others mentioned were mobile technologies, with portable devices becoming more prevalent; internet-based applications and the integration of systems, for example more integrated household management and control. Asked whether they could identify ethical issues to which these are likely to give rise, respondents most frequently cited the security and privacy of data. In addition they mentioned:

  • the boundary between security/counter terrorism and civil liberties;
  • computer hackers will increase by use of the net;
  • retention of data on a server not owned by your organisation;
  • there is a tendency for organisations to assume there are no boundaries to what they can do; there is then an erosion of what is currently acceptable; this always seems to be for the benefit of the organisation and not of the individual;
  • the issues of replacing humans with machines;
  • how virtual reality is used to create situations in relation to gender, religion etc;
  • conflict of use when the same device is used for corporate as well as personal (i.e. private) computing.

Respondents were asked if they could suggest how any ethical issues arising from emerging ICTs should be addressed. A few simply replied ‘no’. Others cited ‘personal responsibility’, the role of education and a technical approach via ‘tightly secured cloud computing’. Regulation was mentioned, as was the setting up of a committee similar to that used to consider ethical issues related to embryology in the UK. The development process was also mentioned, to include ‘multi-stakeholder dialogues’, formal risk assessments and ethics as an integral part. General public forums and focus groups were suggested by one respondent.

Further work

Half of the survey respondents have agreed to be contacted for more in-depth discussion of the issues raised. In particular, the researchers wish, firstly, to pursue the means by which ethical, social or legal issues have been identified and addressed in projects the study participants have worked on. Secondly, to explore the range of future technologies that have been identified and the potential ethical, social or legal issues to which they may give rise. Finally, to discuss the means by which participants suggest these issues should be addressed.

Having summarised the findings from this study with experienced industry practitioners, the paper will compare them with the findings from the more academic/research-oriented participants in the ETICA project. Concluding observations will include suggestions/recommendations for further work in this area.


Beardsworth, Alan & Bryman, Alan (2006), Focus Group Research. Open University Press.

Bryman, A. 2008. Social Research Methods. 3rd ed. Oxford University Press.

ETICA Project home page:

Moor, James (1985), What is Computer Ethics? Metaphilosophy, vol. 16 no. 4, 266-75.