Research Overview

Trust, control and confidence are the key elements of the book, with trust being the most fundamental one (and the hardest to define). Even though the phenomenon of confidence is interdisciplinary, research in confidence (trust and control) tends to follow the recognised structure of disciplines. The following is a short and non-exhaustive overview of different works related to trust and confidence, structured into two main groups.

The first covers the widely understood social sciences (including philosophy, psychology, economics and management), while the second covers the technical approach to confidence, mostly within the context of computer science. Each group presents a varied set of approaches to confidence. The selection has been made on the basis of their importance for the discussion presented here and is not intended to provide complete coverage of the field.

Social Sciences. The ethics of trust and control has been a fundamental question for a long time. However, there is general disagreement about whether trust is morally right or wrong, following the disagreement about the foundations of human nature. Proponents of trust applaud it as a social virtue and point out that without exercising trust, trust will never be reciprocated (e.g. Aristotle, Hume, Kant). Usually, they suggest reforming society by exercising individual virtues in order to maximise trust and minimise control.

Opponents tend to think of trust as a vice and demonstrate the irrationality of trusting in light of visibly high levels of betrayal (e.g. Plato, Machiavelli, Hobbes). This usually goes together with a call for increased control and a strong state, as they see everyday trust as something that can only be guaranteed by strong institutions. Whatever the position, the importance of trust (and, to a certain extent, of control) to human life is evident.

The recognition of trust and control as complexity reducers [Luhmann1979] created the opportunity to discuss them not from a moral but from a rational position, treating both as forms of social enablers. Trust in particular, through its low internal complexity and significant power to reduce complexity, is able to overcome the limitations of our bounded rationality [Simon1957] and reduce the otherwise unbearable complexity of all foreseeable futures.

Therefore, trust can be discussed as a psychological and social necessity that can be rationally analysed, weighed and explained. The fact that trust is present in all forms of communication [Habermas1986] reinforces the perception of trust as a fundamental social enabler.

Trust developed in early childhood is believed to be the foundation of a general disposition to trust, but even more importantly, such trust supports our notion of normality and lowers our existential anxiety [Giddens1988]. Interestingly, this rudimentary form of trust seems to develop in parallel with the understanding of self-identity, thus creating at the same time the notion of self versus others and the notion of trust in specific others (usually caregivers), within a basic understanding of the limitations of time and space.

The observation that there is a link between the level of social trust, welfare and development (e.g. [Fukuyama1996], but see also [Dasgupta2000] and [Luhmann1979]) has created significant interest in considering trust as a form of social capital and has directed research towards a closer understanding of the process of creation and distribution of such capital. Even though the original concept of 'high trust' and 'low trust' societies may not necessarily hold, it has been widely accepted that trust (or, to be more specific, trustworthiness that leads to a vested trust) decreases transactional cost, thus leading to greater efficiency in economic relationships.

Even though it is not clear whether it is the network effect that facilitates trust or trust that allows the network of social relationships to develop [Hardin2002], the social benefit of both is evident.

Confidence can be interpreted as a unidirectional relationship (Alice is confident about Bob), but it can also be a bidirectional one, in which the mutual relationship between Alice and Bob evolves over time. Deutsch [Deutsch1973] differentiates between interpersonal trust (where one party is not aware that the other party trusts it) and mutual trust (where both parties are involved in a mutual relationship of trust).

As the concept of reciprocity (available mostly in relationships) may significantly alter the dynamics of confidence building, the process of building relationships has been widely studied. The dynamics of relationships [Lewicki1996] suggest that a relationship develops through a three-stage process, starting with a control-driven stage and hopefully ending with a trust-based one.

Organisations, the domain of management sciences, have usually been associated with hierarchical control, driven by specialisation and efficiency. More recently, however, there has been a visible trend to discuss trust as an important element of organisations (e.g. [Solomon2001]), both internally (e.g. as a facilitator of innovation [Afuah2003] and as a supporter of organisational change in times of internal conflict [Webb1996]) and externally (e.g. in the virtual organisation [Handy1995]).

The growth of outsourcing has raised questions of trust between cooperating parties, specifically in the context of different cultures (e.g. [Sako1992]). Similarly, the need for increased agility has created interest in the rapid creation of trust in temporary groups [Meyerson1996]. There is an underlying assumption that trust and confidence are attributable to organisations in the same way as to individuals, following the concept of the intentional stance [Dennett1989].

Game theory uses the concept of trust to explain phenomena that run counter to the immediate economic rationality of utility maximisation. The ability of economic players to go beyond obvious self-interest (potentially in the expectation that the other party will reciprocate) became the basis of several economic games. The game of trust [Berg1995] allows trust and trustworthiness to be expressed in monetary terms, thus becoming a model solution in situations where trust should be measured.
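
As an illustration, the sketch below (in Python) captures the structure of such an investment-style game of trust: the investor's transfer operationalises trust, the trustee's return operationalises trustworthiness, and both are expressed in monetary terms. The endowment, the multiplier of three and the example values are illustrative assumptions rather than parameters taken from [Berg1995].

# Minimal sketch of an investment-style 'game of trust'; the concrete
# numbers are illustrative, not taken from [Berg1995] itself.
def trust_game(endowment, sent, return_fraction, multiplier=3):
    """Return (investor_payoff, trustee_payoff) for one round."""
    assert 0 <= sent <= endowment
    transferred = sent * multiplier           # the experimenter multiplies the transfer
    returned = transferred * return_fraction  # the trustee sends part of it back
    investor_payoff = endowment - sent + returned
    trustee_payoff = transferred - returned
    return investor_payoff, trustee_payoff

# 'sent' measures trust, 'return_fraction' measures trustworthiness.
print(trust_game(endowment=10, sent=10, return_fraction=0.5))  # (15.0, 15.0)
print(trust_game(endowment=10, sent=0, return_fraction=0.0))   # (10.0, 0.0)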

More recently, the game of distrust [Bohnet2005] has been used to explain some significant recent management disasters. Similarly, studies of the Prisoner's Dilemma (formalised by A. W. Tucker, as described in [Poundstone1992]) are used to link trust with economic utility and to demonstrate the rationality behind reciprocity.
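
This reasoning can be illustrated with a short simulation of the repeated Prisoner's Dilemma. The payoff values below (temptation 5, reward 3, punishment 1, sucker's payoff 0) are the conventional textbook ones and the strategies are simple examples; the point is only that a reciprocating strategy recovers most of the value that mutual defection destroys.

# Illustrative Prisoner's Dilemma payoffs (T=5 > R=3 > P=1 > S=0); the values
# are conventional textbook ones, used here only to show why reciprocity pays.
PAYOFF = {('C', 'C'): (3, 3), ('C', 'D'): (0, 5),
          ('D', 'C'): (5, 0), ('D', 'D'): (1, 1)}

def play(strategy_a, strategy_b, rounds=10):
    """Accumulate payoffs over repeated play; each strategy sees the opponent's history."""
    hist_a, hist_b, score_a, score_b = [], [], 0, 0
    for _ in range(rounds):
        move_a, move_b = strategy_a(hist_b), strategy_b(hist_a)
        pay_a, pay_b = PAYOFF[(move_a, move_b)]
        score_a, score_b = score_a + pay_a, score_b + pay_b
        hist_a.append(move_a)
        hist_b.append(move_b)
    return score_a, score_b

tit_for_tat = lambda opp: 'C' if not opp else opp[-1]   # reciprocate the last move
always_defect = lambda opp: 'D'

print(play(tit_for_tat, tit_for_tat))     # (30, 30): mutual reciprocity pays
print(play(always_defect, tit_for_tat))   # (14, 9): defection gains little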

Technology. Within the realm of technology, trust and control have usually been associated with reliability (e.g. [Powell2003]) and were not seen as a separate issue until the arrival of complex computer-controlled systems. Computer science initially approached trust and control from the perspective of security. Recognising that trust is not controllable, security research developed elaborate structures of control (e.g. [Bishop2005]) in an attempt to minimise elements of trust (e.g. [Anderson2001]).

However, more recently, the fundamental nature of trust has been addressed in initiatives such as trusted computing [Pearson2002], where individual devices gain assurance about their own configuration on the basis of a highly protected, hardware-based root of trust. The need for a portable root of trust has also fuelled the creation and popularity of smart cards [Rankl2003].

In data communication, the understanding that trust precedes meaningful (and secure) communication has eventually led to the concept of trust management: a separate layer of interactions that leads to the creation (and maintenance) of trust relationships between communicating nodes, following e.g. business agreements, contractual dependencies, personal relationships, etc. PGP [Zimmermann1994] has explored the area of peer-to-peer trust, while PKI ([Adams2002]; see [Perlman1999] for alternative models of trust enabled by PKI) proposed a multi-stage model of trust.
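
The difference between the two models can be sketched in a few lines of Python, assuming a deliberately toy representation in which a certificate merely records who vouches for whom; real PGP and X.509 processing (signatures, revocation, policies) is of course far richer.

# Toy contrast of hierarchical (PKI-style) and web-of-trust (PGP-style) trust;
# the data structures are illustrative assumptions, not real certificate formats.
def pki_valid(subject, issued_by, trusted_roots):
    """Hierarchical trust: walk the issuer chain up to a trusted root."""
    seen = set()
    while subject not in trusted_roots:
        if subject in seen or subject not in issued_by:
            return False            # loop or dangling chain: no trust anchor reached
        seen.add(subject)
        subject = issued_by[subject]
    return True

def pgp_valid(subject, endorsers, trusted_peers, threshold=2):
    """Web of trust: enough directly trusted peers must vouch for the key."""
    return len(endorsers.get(subject, set()) & trusted_peers) >= threshold

issued_by = {'alice': 'acme-ca', 'acme-ca': 'root-ca'}
print(pki_valid('alice', issued_by, trusted_roots={'root-ca'}))           # True
print(pgp_valid('bob', {'bob': {'carol', 'dave'}}, {'carol', 'dave'}))    # True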

More recently, WS-Trust [Anderson2005] has established itself as a standard within service-oriented architecture (SOA), the potential foundation of Web 2.0, while [Ishaya2004] offers trust management for virtual organisations. Grid computing (e.g. [Dragovic2003]) and pervasive computing environments (e.g. [LoPresti2005]) have brought different challenges to trust management.

The need to effectively manage distributed computing systems has led to constructs such as trusted domains (several computers trusting each other's authentication capabilities), trusted credentials (others' identities accepted without any further proof), trusted storage (storage space accessible only to selected users), trusted zones (privileged Internet address space) etc.

In all these cases there is a notion of trust as essential yet different from actual cooperation (or communication), something that requires special management practices. Usually, the ability to manage trust is granted to system administrators or users, in the expectation that the technical structure of trust will reflect trust in respective social relationships.

Research on autonomous agents (e.g. [Falcone2006]) has liberated trust management from the need for a priori trust managed by the user or the administrator. Agents have been vested with the ability to make and break trust relationships (which might more correctly be called 'relationships of confidence'), usually on the basis of past experience, through a process of learning, whether from direct interactions or from others' experience.

Autonomous agents have brought the notion of imperfect trust (where trust is no longer a binary proposition), as well as the problems of trust propagation [Josang2006] and of reasoning about trust. The new approach to trust has also, unfortunately, revealed new threats, usually in the form of attacks on reputation [Dellarocas2004].

Interest in large systems (whether created by autonomous agents, ad hoc networks or in any other way) has required more specific instruments for reasoning about trust. To name a few, Josang [Josang2001] proposed an algebra of uncertain probabilities, introducing the notion of uncertainty into the probabilistic distribution of outcomes. Formalisations of trust (e.g. [Demolombe2004], [Marx2001], [Huang2006], [Herrmann2006]) propose logical primitives and schemes that can be used in reasoning about trust.
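
A minimal sketch of the kind of opinion structure used in such an algebra of uncertain probabilities is given below: belief, disbelief and uncertainty sum to one, a base rate fills in the uncertain mass when a probability expectation is needed, and a simplified discounting operation shows how an opinion weakens when relayed through a partially trusted source. The numbers and the exact form of the operator are illustrative rather than a faithful reproduction of [Josang2001].

from dataclasses import dataclass

# A minimal opinion in the style of subjective logic:
# belief + disbelief + uncertainty = 1, with a base rate used for expectations.
@dataclass
class Opinion:
    belief: float
    disbelief: float
    uncertainty: float
    base_rate: float = 0.5

    def expectation(self):
        # probability expectation of the opinion
        return self.belief + self.base_rate * self.uncertainty

def discount(trust_in_source, source_opinion):
    """Simplified trust discounting: the less the source is believed,
    the more uncertain the transferred opinion becomes."""
    b, d, u = (trust_in_source.belief,
               trust_in_source.disbelief,
               trust_in_source.uncertainty)
    return Opinion(belief=b * source_opinion.belief,
                   disbelief=b * source_opinion.disbelief,
                   uncertainty=d + u + b * source_opinion.uncertainty,
                   base_rate=source_opinion.base_rate)

alice_in_bob = Opinion(0.8, 0.1, 0.1)   # Alice's opinion about Bob as a source
bob_about_x = Opinion(0.9, 0.0, 0.1)    # Bob's opinion about statement x
print(discount(alice_in_bob, bob_about_x).expectation())   # approx. 0.86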

The formalisation of reasoning has led to the creation of several formal systems and supporting tools. Marsh's formal model of trust [Marsh1994a] brings the concept of trust closer to the domain of computation, while Grandison's Sultan [Grandison2003] allows trust-based relationships to be captured, simulated and reasoned about.

Both reasoning and transitivity require trust (confidence) to be qualified. The desire to measure trust (and confidence) has generated a significant amount of research. Several works link trust with probability, either directly (where trust itself is perceived as a probability, e.g. subjective probabilities [Gambetta2000]) or through models related to economics, e.g. Barber's [Barber1983] model of probability or Hardin's [Hardin2002] Bayesian model associated with economic payoff. Almost every model of trust has introduced a different range of values that can be assigned to trust (see e.g. [Abdul-Rahman2005] for a review), sometimes with conflicting semantics.
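
As a generic illustration of the probabilistic view (not of any particular model cited above), trust can be estimated as the expected probability of a positive outcome after a number of positive and negative experiences, assuming a uniform prior:

# Generic Bayesian illustration: trust as the expected probability of a good
# outcome after 'positives' good and 'negatives' bad experiences (uniform prior).
def expected_trust(positives, negatives):
    return (positives + 1) / (positives + negatives + 2)

print(expected_trust(0, 0))    # 0.5  - no evidence, neutral prior
print(expected_trust(8, 2))    # 0.75 - mostly good experiences
print(expected_trust(0, 5))    # approx. 0.14 - consistent disappointment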

From a more application-specific perspective, electronic commerce has used various metrics of trust to develop risk assessment, both for the seller and for the buyer. This has become an important focal point of several works [Rutter2001], [Kracher2005]. The commercial value of eBay's reputation system [Resnick2006] is widely known, and similar rating systems [Dellarocas2004] are used by other e-commerce sites.
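
The aggregation behind such rating systems can be sketched very simply; the summary below (net score plus percentage of positive feedback) is a common illustrative style of calculation, not eBay's actual algorithm.

# Illustrative feedback aggregation in the style of marketplace reputation
# systems; zero entries are treated as neutral and ignored.
def reputation(feedback):
    positives = sum(1 for f in feedback if f > 0)
    negatives = sum(1 for f in feedback if f < 0)
    rated = positives + negatives
    percent_positive = 100.0 * positives / rated if rated else None
    return positives - negatives, percent_positive

print(reputation([+1, +1, +1, -1, 0, +1]))   # (3, 80.0)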

Collaborative filtering [O'Donovan2005] has been used to aid information search (following the concept that trust is a qualified reliance on information [Gerck2002]); as more automated systems took over information search [Page1998], collaborative filtering became the preferred solution for recommendation. The needs of electronic commerce have stimulated an interdisciplinary approach to trust [McKnight2001], [Tan2000].
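
A minimal sketch of trust-enhanced collaborative filtering in the spirit of [O'Donovan2005] is shown below: a rating is predicted as a trust-weighted average of the neighbours' ratings. The weighting scheme and the data are illustrative assumptions, not the paper's exact algorithm.

# Illustrative trust-weighted collaborative filtering: predict a rating as a
# trust-weighted average of neighbours' ratings; weights and data are made up.
def predict_rating(item, neighbour_ratings, trust):
    num, den = 0.0, 0.0
    for neighbour, ratings in neighbour_ratings.items():
        if item in ratings:
            w = trust.get(neighbour, 0.0)
            num += w * ratings[item]
            den += w
    return num / den if den else None

neighbour_ratings = {'bob': {'book': 4.0}, 'carol': {'book': 2.0}}
trust = {'bob': 0.9, 'carol': 0.3}
print(predict_rating('book', neighbour_ratings, trust))   # approx. 3.5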

Another effect of the introduction of electronically mediated communication is the development of research into user trust in digital devices, e.g. in the form of web features that facilitate the creation of perceived trust [Egger2000], trust in information systems [Li2004], or improvements in trust between people communicating through a digital channel.
