Trust is one of the key social concepts “that helps human agents to cope with their social environment and is present in all human interactions” (Gambetta, 1990). Without trust in other agents, infrastructures and organizations, human interactions are weakened and, ultimately, almost no social cooperation would be possible. Trust is fundamental for dealing with different factors or situations, all displaying a lack of certainty: a) with the social environment (i.e., relying upon incomplete information and beliefs in order to take decisions); b) with the behaviour of others (i.e., relying upon others' never fully predictable actions in order to attain a goal); c) with organizations or infrastructures (i.e., relying upon organizations or infrastructures whose functioning is not entirely visible or known). Trust exists insofar as there is risk (Luhmann, 1979), or at least something beyond control.
Trust is not only concerned with establishing human relations but also plays a paramount role in the networked information society. The importance of adopting networked cooperation, so as to enhance the creation and distribution of informational resources, has recently been stressed from a broad socio-political and economic perspective (Benkler, 2006). All the major aspects of multi-agent systems require trust for the success of operations, negotiations, and relations based on computer-mediated interaction or coordination between individuals or groups. In my paper I therefore focus on the question of trust-building within environments based on web technologies, in line with the topic of the Ethicomp meeting – living, working, and learning beyond technology.
Building trust is not only a question of ensuring technological security (by means of rules, constraints, protocols, architectures, controls, guarantees, etc.). It is mainly concerned with mental and social dispositions towards other agents, which have to be considered within the framework of a specific model of online social trust. This requires a triple paradigm shift in the study of trust: a) from technologically gained security to perceived security (i.e., security as it is perceived by agents disposing of incomplete information and predictions); b) from control trust (i.e., trust based on control mechanisms for assessing and establishing trustworthiness) to party trust (i.e., trust based on the dynamic social interaction between a party – the trustor – and a counterparty – the trustee); c) from a model of probabilistic trust (i.e., a model based on rigid methods of statistical inference) to a model of cognitive social trust (i.e., a model based on beliefs, expectations and concerns).
In the first part of the paper I analyse why this triple paradigm shift is required by the specific evolution of the networked information society, and how conducive the application of the cognitive social model of trust (mainly elaborated by Castelfranchi and Falcone) is to accounting for the dynamics of online social cooperation. Building such cooperation upon trust demands clarifying both the role of the beliefs, mental representations and expectations of the trustor (which are modelled according to a degree of trustworthiness) and the status of the act of trusting itself (which is bound up with a sharp decision). In this regard, the lack of certainty on which trust rests is crucial: we can speak neither of an absolute absence of information (since some elements are necessary to mould and evaluate trustworthiness, leaving aside the extreme case of blind trust) nor of complete and certain information (which would exclude the dimension of risk inherent in the act of trusting, namely of relying upon what is beyond control).
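The interplay between a graded degree of trustworthiness and the sharp, binary act of trusting can be illustrated with a deliberately simplified sketch. The belief names (competence, willingness), the multiplicative combination, and the threshold below are illustrative assumptions of this sketch, not the formal apparatus of Castelfranchi and Falcone's model:

```python
# Illustrative sketch: a graded evaluation of trustworthiness feeding
# a sharp, binary decision to trust. The belief names, the product
# combination, and the threshold are hypothetical simplifications.

def degree_of_trust(competence: float, willingness: float) -> float:
    """Combine the trustor's beliefs about the trustee's competence and
    willingness (each in [0, 1]) into a single graded value."""
    return competence * willingness

def decide_to_trust(competence: float, willingness: float,
                    threshold: float = 0.5) -> bool:
    """The act of trusting is a sharp decision: trust is granted only
    when the graded degree of trustworthiness exceeds a threshold."""
    return degree_of_trust(competence, willingness) > threshold

# A trustee believed highly competent but only moderately willing
# yields a middling degree of trust, yet the decision is all-or-nothing.
print(degree_of_trust(0.9, 0.6))
print(decide_to_trust(0.9, 0.6))
```

The point of the sketch is the asymmetry it makes visible: trustworthiness is a matter of degree, shaped by incomplete beliefs, while the act of trusting collapses that continuum into a yes-or-no commitment under risk.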
In the second part of the paper the heuristic virtues of the socio-cognitive model of trust are tested in relation to a specific topic, namely the online social cooperation required by the production of common goods within the theoretical framework provided by the economics of networked information. In particular, the model of trust is applied as a viable explanation for the success of social cooperation in creating and sharing information and knowledge. More specifically, the aim is to show why technological architecture does not by itself suffice to assure the non-market production of shared informational resources. In order to explain this result, the mental and social dispositions towards trustworthy forms of networked cooperation have to be taken into account.
The third and last section of the paper draws some conclusions from the previous analyses in order to work out how social trust can be part of the definition of rationality, or at least of a rational system of decisions and behaviours, in the sphere of the networked society. In this perspective, we adopt a concept of practical reason suggested and put forward by the work of Friedrich Hayek, who remarkably noticed that “surely, one of the tasks of reason is to decide how far it is to extend its control or how far it ought to rely on other forces which it cannot wholly control” (Hayek, 1982). An ordered and efficient multi-agent system is not based on a rigid organization of cooperation (by means of social control, the authority of a third party, technological devices for security, protocols, rules or other constraints, etc.) but on its own capacity to promote online social cooperation by building a web of trust and a trust atmosphere (Castelfranchi, 2004). The inner practical rationality of the multi-agent system of the networked information society is not to be found in a set of certainties (from which the social order should be deduced) but in its capacity to cope with the structural lack of knowledge characterizing the asymmetry between environmental and systemic information.
Benkler Y., The Wealth of Networks: How Social Production Transforms Markets and Freedom, Yale University Press, New Haven CT, 2006.
Castelfranchi C., “Trust Mediation in Knowledge Management and Sharing”, in C. Jensen, S. Poslad & T. Dimitrakos (eds.), Trust Management: Second International Conference, iTrust 2004, Oxford, March 29 – April 1, 2004, Proceedings, Lecture Notes in Computer Science, vol. 2995, Springer, 2004.
Castelfranchi C. & Falcone R., “Social Trust: Cognitive Anatomy, Social Importance, Quantification and Dynamics”, in Proceedings of the First Workshop on Deception, Fraud and Trust in Agent Societies, Minneapolis/St. Paul, pp. 35-49, 1998.
Gambetta D., Trust, Basil Blackwell, Oxford, 1990.
Hayek F., Law, Legislation and Liberty, Routledge & Kegan Paul, London, 1982.
Luhmann N., “Trust: A Mechanism for the Reduction of Social Complexity”, in Id., Trust and Power: Two Works by Niklas Luhmann, John Wiley & Sons, New York, pp. 1-103, 1979.