Brain-Computer Interfaces: a technical approach to supporting privacy

AUTHORS
Kirsten Wahlstrom, Ben Fairweather and Helen Ashman

ABSTRACT

Introduction

Brain-Computer Interfaces (BCIs) facilitate communication between a brain and a computer and can be categorised by function: interpretation of neural activity, stimulation of neural activity, and combined interpretation-stimulation. Warwick’s self-experiments with implants in the interpretation-stimulation category (Warwick and Gasson, 2004) demonstrate the technical feasibility of extending the human nervous system beyond its biological limits to other systems, and to other people, via the Internet. Furthermore, there have been recent advances in interpreting passive neural activity (Coffey et al., 2010) and in concurrently interpreting visual and motor intentional neural activity (Allison et al., 2010). A future BCI (fBCI) integrating these technical features would concurrently interpret both intentional and passive neural activity in order to communicate information to systems and other people via the Internet.

In addition to these research advances, BCIs that interpret intentional neural activity via electroencephalography (EEG) are already available to consumers (Emotiv Systems, Intendix). Should the fBCI prove commercially viable, an ethical and legal obligation to support privacy will exist.

Privacy emerges from a society’s communication practices (Westin, 2003). Although acculturation plays a role in shaping privacy expectations, the extent to which one person requires privacy may differ from that required by another (Gavison, 1980). In addition, a person’s perception of privacy depends on context (Solove, 2006). For example, I am likely to disclose details of my health openly to my father, judiciously to my friends, and not at all to strangers. Thus, privacy requirements are diverse and susceptible to change.
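By way of illustration, such context-dependent requirements can be represented as a mapping from audience to disclosure level. The following is a minimal Python sketch; the audiences, levels and default behaviour are hypothetical assumptions for illustration, not drawn from any cited system.

    from enum import Enum

    class Disclosure(Enum):
        """Illustrative disclosure levels; real requirements are richer."""
        OPEN = "disclose freely"
        JUDICIOUS = "disclose selectively"
        WITHHOLD = "do not disclose"

    # A hypothetical policy: the same information attracts different
    # disclosure levels depending on who is asking (the context).
    health_policy = {
        "father": Disclosure.OPEN,
        "friend": Disclosure.JUDICIOUS,
        "stranger": Disclosure.WITHHOLD,
    }

    def disclosure_for(audience: str) -> Disclosure:
        # Assumed default: withhold when the context is unknown.
        return health_policy.get(audience, Disclosure.WITHHOLD)

    print(disclosure_for("friend"))    # Disclosure.JUDICIOUS
    print(disclosure_for("stranger"))  # Disclosure.WITHHOLD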

Privacy is a component of freedom, autonomy and identity. When using technologies, people assert independence and autonomy by declining to participate, or by using anonymity or misinformation to create and maintain privacy (Lenhart and Madden, 2007; Fuster, 2009). When people opt out, adopt anonymity or supply misinformation, the effectiveness of any technology reliant upon accurate and representative data is compromised.

The conceptualisation of privacy as culturally shaped and unique to each person and their immediate context is well understood, long-standing, and widely applied by law- and policy-makers. It forms the basis for legislative and other regulatory approaches such as the Australian Privacy Act, the EU’s Privacy Directives and the OECD’s Guidelines. These legal obligations, and further ethical obligations (Burkert, 1997), mandate that technologies support privacy. In addition, if technologies support privacy, people are more likely to provide accurate information, adding value to the technology itself. However, to the authors’ knowledge, there have been no investigations of technical approaches to supporting privacy in BCIs. This paper presents a conceptual model for consideration and critique.

BCI technology

BCIs identify and measure the electrical activity associated with activating specific neural pathways (Berger et al., 2007). These measurements are then applied to the control of external systems (Hochberg et al., 2006). The identification and measurement of neural activity has been achieved with both surgically invasive and non-invasive approaches. While invasive BCIs identify and measure neural activity more accurately, non-invasive approaches carry fewer health risks. Thus, there has been interest in improving the accuracy of non-invasive BCIs (Allison et al., 2010).
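As a sketch of the measure-then-control pattern, the following Python fragment estimates the power of a simulated EEG epoch in one frequency band and applies a threshold to drive a command. The sampling rate, the alpha band, the relative threshold and the bandpass_power helper are illustrative assumptions, not parameters of any cited system.

    import numpy as np

    def bandpass_power(signal: np.ndarray, fs: float, lo: float, hi: float) -> float:
        """Estimate mean power in a frequency band via the FFT (a common
        first step when measuring EEG; real BCIs use far richer features)."""
        spectrum = np.abs(np.fft.rfft(signal)) ** 2
        freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
        band = (freqs >= lo) & (freqs <= hi)
        return float(spectrum[band].mean())

    # Simulated one-second EEG epoch sampled at 256 Hz: a 10 Hz
    # (alpha-band) oscillation in noise stands in for measured activity.
    fs = 256.0
    t = np.arange(0, 1, 1 / fs)
    epoch = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)

    alpha = bandpass_power(epoch, fs, 8.0, 12.0)
    broadband = bandpass_power(epoch, fs, 1.0, 40.0)
    # A simple relative threshold on the measurement then drives
    # an external system.
    command = "activate" if alpha > 2 * broadband else "idle"
    print(command)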

BCIs incorporate neural networks that must be trained, first to identify a person’s neural activity and then to map specific activity patterns to specific intentions. For example, consider a scenario in which Ann has purchased a new BCI to use with her mobile phone. She must spend time training the BCI to recognise the unique pattern of neural activity produced when she imagines each person in her phone’s address book, and to recognise the activity corresponding to the ‘call’ and ‘hang up’ commands.
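The training phase might be sketched as follows. The nearest-centroid decoder below is a deliberately simple stand-in for the trained neural network, and the intention labels, feature dimensions and session sizes are assumptions made only for illustration.

    import numpy as np

    class IntentDecoder:
        """Toy nearest-centroid decoder: learns one prototype feature
        vector per intention from labelled training epochs."""

        def __init__(self):
            self.prototypes = {}

        def train(self, intent: str, epochs: np.ndarray) -> None:
            # Average the feature vectors recorded while Ann rehearses
            # one intention (e.g. imagining 'Charlie' or thinking 'call').
            self.prototypes[intent] = epochs.mean(axis=0)

        def decode(self, epoch: np.ndarray) -> str:
            # Classify a new epoch by its nearest learned prototype.
            return min(self.prototypes,
                       key=lambda k: np.linalg.norm(epoch - self.prototypes[k]))

    rng = np.random.default_rng(0)
    decoder = IntentDecoder()
    # Hypothetical training session: 20 labelled 8-feature epochs
    # per intention, each drawn from a different cluster.
    for intent, centre in [("charlie", 0.0), ("call", 3.0), ("hang up", 6.0)]:
        decoder.train(intent, rng.normal(centre, 0.5, size=(20, 8)))

    print(decoder.decode(rng.normal(3.0, 0.5, size=8)))  # expected: 'call'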

Conceptual model

If BCIs can identify and measure neural activity, then they can also identify and measure a person’s privacy perceptions and requirements: a privacy requirement is, after all, an intention that a BCI can be trained to recognise like any other. The person’s privacy requirement can then be applied to any information they may be sharing. For example, consider a scenario in which Bob is using a BCI to interact with his mobile phone. He is calling Charlie but does not want the call logged in the phone’s memory. First, he thinks of Charlie and the phone retrieves Charlie’s number. Then Bob thinks of not logging the call, and the phone saves this privacy requirement in its working memory. Finally, Bob thinks ‘call’ and the phone places the call without logging it.
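This scenario can be sketched as a small state machine in which a decoded privacy requirement is buffered in working memory and consulted when the call is placed. The intent labels, address book and phone number below are hypothetical; decoding itself is assumed to have already happened upstream.

    class PhoneBCI:
        """Sketch of the Bob scenario: decoded intentions arrive one at
        a time, and a pending privacy requirement modifies the action."""

        def __init__(self):
            self.address_book = {"charlie": "+61 400 000 000"}  # hypothetical
            self.call_log = []
            self.pending_number = None
            self.suppress_logging = False  # the buffered privacy requirement

        def on_intent(self, intent: str) -> None:
            if intent in self.address_book:        # 'think of Charlie'
                self.pending_number = self.address_book[intent]
            elif intent == "do not log":           # privacy requirement
                self.suppress_logging = True
            elif intent == "call" and self.pending_number:
                print(f"calling {self.pending_number}")
                if not self.suppress_logging:
                    self.call_log.append(self.pending_number)
                # Reset working memory for the next interaction.
                self.pending_number, self.suppress_logging = None, False

    phone = PhoneBCI()
    for thought in ["charlie", "do not log", "call"]:
        phone.on_intent(thought)
    print(phone.call_log)  # [] -- the call was placed but not logged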

This scenario presents a binary choice: log the call or do not log it. However, privacy requirements are far more diverse. If the conceptual model can be refined to support this diversity, a technical prototype will be designed, implemented and tested.
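For instance, a refined model might recognise graded logging requirements rather than a single on/off switch. The gradations below are illustrative assumptions only, not a proposed taxonomy.

    from enum import Enum

    class LogRequirement(Enum):
        """Hypothetical gradations between 'log' and 'do not log'."""
        FULL = "log number, time and duration"
        ANONYMISED = "log that a call occurred, but not to whom"
        EPHEMERAL = "log, but delete the entry after 24 hours"
        NONE = "do not log at all"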

The full paper will further conceptualise privacy with a view to informing a future prototype. It will then describe the technologies underlying BCIs. These conceptual and technical descriptions will enable the proposition of a technical conceptual model for the prototype, one which offers flexibility with respect to privacy and interoperability with existing BCIs. The conclusions are intended to stimulate consideration, discussion and critique.

REFERENCES

ALLISON, B. Z., BRUNNER, C., KAISER, V., MULLER-PUTZ, G. R., NEUPER, C. & PFURTSCHELLER, G. 2010. Toward a hybrid brain-computer interface based on imagined movement and visual attention. Journal of Neural Engineering, 7, 026007.

BERGER, T., CHAPIN, J., GERHARDT, G., MCFARLAND, D., PRINCIPE, J., SOUSSOU, W., TAYLOR, D. & TRESCO, P. 2007. International Assessment of Research and Development in Brain-Computer Interfaces.

BURKERT, H. 1997. Privacy-Enhancing Technologies: typology, critique, vision. Technology and privacy: the new landscape.

CAMPBELL, A., CHOUDHURY, T., HU, S., LU, H., MUKERJEE, M., RABBI, M. & RAIZADA, R. 2010. NeuroPhone: Brain-Mobile Phone Interface using a Wireless EEG Headset. MobiHeld 2010. ACM.

COFFEY, E., BROUWER, A.-M., WILSCHUT, E. & VAN ERP, J. 2010. Brain-machine interfaces in space: Using spontaneous rather than intentionally generated brain signals. Acta Astronautica, 67, 1-11.

DRUMMOND, K. 2009. Pentagon Preps Soldier Telepathy Push. Wired.

EMOTIV SYSTEMS. Emotiv – Brain Computer Interface Technology [Online]. Available: http://emotiv.com.

FUSTER, G. 2009. Inaccuracy as a privacy-enhancing tool. Ethics and Information Technology.

GAVISON, R. 1980. Privacy and the Limits of Law. The Yale Law Journal, 89, 421-471.

HOCHBERG, L., SERRUYA, M., FRIEHS, G., MUKAND, J., SALEH, M., CAPLAN, A., BRANNER, A., CHEN, D., PENN, R. & DONOGHUE, J. 2006. Neuronal ensemble control of prosthetic devices by a human with tetraplegia. Nature, 442, 164-171.

INTENDIX. Personal EEG-based Spelling System.

LENHART, A. & MADDEN, M. 2007. Teens, Privacy and Online Social Networks: How teens manage their online identities and personal information in the age of MySpace.

SHACHTMAN, N. 2008. Army Funds ‘Synthetic Telepathy’ Research. Wired.

SOLOVE, D. 2006. A Taxonomy of Privacy. University of Pennsylvania Law Review, 154, 477-560.

WARWICK, K. & GASSON, M. 2004. Extending the human nervous system through internet implants – experimentation and impact. IEEE International Conference on Systems, Man and Cybernetics, The Hague, Netherlands, 2046-2052.

WESTIN, A. 2003. Social and Political Dimensions of Privacy. Journal of Social Issues, 59, 431-453.