Design for Privacy: Towards a Methodological Approach to Trustworthy Design

AUTHOR
L. Jean Camp, Kalpana Shankar and Kay Connelly

ABSTRACT

The paper discusses privacy in ubicomp as a design, social, technical, and policy issue; outlines research challenges presented by the technical and social dimensions of using sensor networks as a monitoring technology, and describes the beginning and need for a methodology for designing for privacy in ubicomp.

I. Introduction & Overview

Ubiquitous and pervasive computing, also known as ubicomp, will result in large-scale transformational change as our environment becomes aware, active, and responsive. Through the distribution of sensors and tags such as RFID, ubicomp environments become active as sensor data are processed and examined, then trigger responses in the environment. Ubicomp has the potential to foster ubiquitous surveillance, or to extend human autonomy. Realizing this potential, while minimizing troubling effects, requires that ubicomp be built with an appreciation of values, in particular the legal, social, cultural, and personal value that is privacy. This double-edged sword is of particular importance in the health care environment, where ubicomp has significant potential.

In this paper, we argue that deciding in the design stage who will have access to and control of information can enhance functionality while protecting privacy.
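The idea of fixing access and control decisions at design time can be sketched in code. The following is an illustrative sketch only, not a system described in this paper; all names (`Record`, `AccessPolicy`, the caregiver scenario) are hypothetical. The key design choice it encodes is that the subject, not the system designer, decides who may read which kinds of data.

```python
from dataclasses import dataclass, field

@dataclass
class Record:
    """A piece of monitored information about a subject."""
    subject: str   # the person the data is about
    kind: str      # e.g. "heart_rate", "location"
    value: object

@dataclass
class AccessPolicy:
    """Design-time policy: the subject decides who may read which kinds of data."""
    grants: dict = field(default_factory=dict)  # (reader, kind) -> bool

    def grant(self, reader: str, kind: str) -> None:
        self.grants[(reader, kind)] = True

    def can_read(self, reader: str, kind: str) -> bool:
        return self.grants.get((reader, kind), False)

def read(record: Record, reader: str, policy: AccessPolicy):
    """Release data only when the subject's policy permits it."""
    if policy.can_read(reader, record.kind):
        return record.value
    return None  # withheld: the subject retains control

# The subject grants a caregiver access to heart-rate data only.
policy = AccessPolicy()
policy.grant("caregiver", "heart_rate")

hr = Record("alice", "heart_rate", 72)
loc = Record("alice", "location", "bathroom")

print(read(hr, "caregiver", policy))   # released: 72
print(read(loc, "caregiver", policy))  # withheld: None
```

Because the policy gates every release of data, functionality (the caregiver sees vital signs) coexists with privacy (location data stays with the subject).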

Privacy has been addressed too simplistically in technical design because designs start with a fixed definition of privacy, instead of deriving a definition from the needs, preferences, and understandings of system users and embedding that definition in the design. More complex definitions of user-centered privacy incorporate computer security and various rights.

Computer security and privacy are closely linked, and in most cases, well aligned. Security is the control of information. Privacy is the control of information by its subject. Security can provide tools for anonymizing data, so that privacy is not a concern. Thus anonymization is a foundational pillar of this work because of its ability to enhance personal privacy.
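As one illustration of such a tool, consider replacing identities with salted-hash pseudonyms before sensor data are stored. This is a minimal sketch, not drawn from the paper; note that it implements pseudonymization, which is weaker than full anonymization, since whoever holds the salt can re-link pseudonyms to identities.

```python
import hashlib
import secrets

# A random, secret salt held by the system operator. Without it, an
# outsider cannot link the pseudonyms below back to identities.
SALT = secrets.token_bytes(16)

def pseudonymize(identity: str) -> str:
    """Replace an identity with a salted-hash pseudonym before storage."""
    return hashlib.sha256(SALT + identity.encode()).hexdigest()[:16]

def store_reading(identity: str, sensor: str, value: float) -> dict:
    """Record sensor data keyed by pseudonym, never by name."""
    return {"who": pseudonymize(identity), "sensor": sensor, "value": value}

r1 = store_reading("alice", "motion", 1.0)
r2 = store_reading("alice", "motion", 0.0)
r3 = store_reading("bob", "motion", 1.0)

# Readings from the same person remain linkable to one another
# (useful for the system), but carry no name.
assert r1["who"] == r2["who"]
assert r1["who"] != r3["who"]
assert "alice" not in str(r1)
```

The design choice here is deliberate: the system keeps enough linkage to function (tracking one person's readings over time) while stripping the identity that would make a disclosure a privacy harm.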

Yet security technologies also enable authorization, identification, and verification of identity. When subjects can be identified, their inability to control information linked with their identities is a threat to privacy.

Privacy can also be a right to seclusion, "the right to be let alone" [1,2]. Privacy as seclusion is the underlying theory of privacy tort rights. Constant video ubicomp potentially violates seclusion even when there is anonymity. Privacy as intimacy, a protected sphere of life, does not require anonymity.

In this paper, we describe a theoretical and methodological framework for the value-sensitive design and evaluation of privacy-enhancing technologies for informal caregivers to monitor and interact with chronically ill or aging family members. In Section II, we discuss the competing visions of privacy policy and privacy as a design principle. In Section III, we discuss our motivations for selecting our application domain, home health care, and the technical and social dimensions of ubicomp in that context. We close by arguing that the design-for-values framework offers great promise for privacy in health care.

II. Theoretical Foundation: Privacy as a Design Value

On the surface, there is a seemingly inherent tradeoff between ubicomp and privacy. Yet privacy-enhancing ubicomp is not an oxymoron: privacy and ubiquitous computing can, together, serve to enhance individual autonomy. Of course, there can be a conflict between the designer's desire to have information to make optimal use of the system and the subject's right to privacy, that is, their control of information about themselves. Yet carefully selecting what information is captured can limit this conflict; in such a case, accuracy of depiction becomes the primary issue. Consider, for example, ubicomp in a bathroom. Accurate depictions would violate privacy in the sense of seclusion even without the association of a unique name to any user. Yet such ubicomp may be necessary, not only for the widely hyped war on terror (to prevent bombings in crowded public facilities) but also for the simple reason that the bathroom is the most common location for injuries from falls among the elderly.
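The bathroom example suggests a design-time resolution: the sensor emits only the abstract event the caregiver needs (a fall), never an accurate depiction. The sketch below is hypothetical and not from the paper; the `RawFrame` structure and the acceleration threshold are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class RawFrame:
    """What the sensor could capture: an accurate depiction (never stored)."""
    pixels: list         # the privacy-violating depiction
    acceleration: float  # a spike in acceleration suggests a fall

# Hypothetical detection threshold, in g.
FALL_THRESHOLD = 2.5

def minimize(frame: RawFrame) -> dict:
    """Discard the depiction; emit only the abstract event."""
    return {"fall_detected": frame.acceleration > FALL_THRESHOLD}

normal = RawFrame(pixels=[0, 0, 0], acceleration=1.0)
fall = RawFrame(pixels=[0, 0, 0], acceleration=3.2)

print(minimize(normal))  # {'fall_detected': False}
print(minimize(fall))    # {'fall_detected': True}
```

Nothing leaving the `minimize` function can violate seclusion, because the depiction is dropped at the earliest possible stage; what remains is exactly the information that serves the subject's safety.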

Privacy is also a property right [3]. Control over personal information can yield economic advantage [4]. Video ubicomp that provides demographic information and enables price discrimination can violate this dimension of privacy.

Understanding how these meanings can inform design is fundamental to designing for privacy. In previous work, Prof. Camp offered and tested a set of mechanisms for the design of privacy-enhancing technologies for Internet commerce based on data availability [5]. Yet examining designs in the wired Internet context is far simpler than in the ubicomp context. To illustrate the complexities, consider the case of home health care.

REFERENCES

[1] Prosser, W.L., Handbook of the Law of Torts, West Publishing Company, St. Paul, MN, 1941.

[2] Bloustein, E., "Privacy as an aspect of human dignity: an answer to Dean Prosser," New York University Law Review, Vol. 39, 1968, pp. 962-970.

[3] Mell, P., "Seeking shade in a land of perpetual sunlight: privacy as property in the electronic wilderness," Berkeley Technology Law Journal, 1996 (http://www.law.berkeley.edu/journals/btlj/index.html).

[4] Odlyzko, A., "Privacy and price discrimination," in L.J. Camp & S. Lewis (eds.), The Economics of Information Security, Springer/Kluwer, Waltham, MA, 2004.

[5] Camp, L.J. Trust and Risk in Electronic Commerce, MIT Press, Cambridge, MA, 2001.