2 Related Work
In their evolutionary history, humans have developed several mechanisms for the
emergence and establishment of social norms [2]. Punishment and reputation are
among the most widespread and effective mechanisms to sustain cooperation and
they are especially interesting for virtual societies, in which the efficacy of enforcing
mechanisms is limited by a combination of factors (such as their massive size,
the spontaneity of their creation and destruction, and their dynamics). There is a large
body of evidence showing that humans are willing to punish non-cooperators,
even when this implies a reduction in their own payoffs [10], and this holds in
simulation settings as well. Villatoro et al. [27] analyzed the effect of sanctioning
on the emergence of a norm of cooperation, showing that monetary punishment
accompanied by norm elicitation, that is, sanctioning, allowed the
system to reach higher cooperation levels at lower costs than
other punishment strategies.
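The logic of costly punishment described above can be illustrated with a minimal sketch; all payoff values here are illustrative assumptions, not parameters taken from the cited studies:

```python
# Minimal sketch of costly punishment in a one-shot interaction.
# BENEFIT, COST, PUNISH_FEE, and PUNISH_FINE are illustrative
# assumptions, not values from the cited studies.

BENEFIT = 3      # gain an agent gets when the partner cooperates
COST = 1         # cost of cooperating
PUNISH_FEE = 1   # cost the punisher pays to sanction
PUNISH_FINE = 3  # even higher loss imposed on the punished defector

def payoffs(coop_a: bool, coop_b: bool, a_punishes: bool) -> tuple[int, int]:
    """Payoffs for agents A and B; A may punish B's defection."""
    pay_a = (BENEFIT if coop_b else 0) - (COST if coop_a else 0)
    pay_b = (BENEFIT if coop_a else 0) - (COST if coop_b else 0)
    if a_punishes and not coop_b:
        pay_a -= PUNISH_FEE   # punishing is costly for the punisher...
        pay_b -= PUNISH_FINE  # ...and costlier still for the defector
    return pay_a, pay_b

# A cooperates, B defects, A punishes: A ends at -2, B at 0.
print(payoffs(True, False, True))  # -> (-2, 0)
```

The point of the sketch is that the punisher accepts a payoff reduction (the fee) in order to make defection unprofitable for the target, which is exactly why sustained punishment requires an explanation in evolutionary terms.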
Reputation is, along with punishment, the other main strategy used to support
cooperation, although it works in a completely different way. Whereas punishing
means paying a cost in order to make the defector pay an even higher cost for his
defection, reputation implies that information about agents' past behavior
becomes known, which allows agents to avoid ill-reputed individuals. In Axelrod's
words [1]: “Knowing people's reputation allows you to know something
about what strategy they use even before you have to make your first choice”
(p.151). The importance of reputation for promoting and sustaining social control
is uncontroversial: it has been demonstrated both in lab experiments [23]
and in simulation settings, where reputation has proven to be a cheap and
effective means of avoiding cheaters and increasing cooperators' payoffs [21].
When partner selection is available, reputation becomes essential for discriminating
between good and bad partners, and thus for protection against
exploitation. Giardini and Conte [13] presented ethnographic data from different
traditional societies along with simulation data, showing how reputation spread-
ing evolved as a solution to the problem of adaptation posed by social control,
and highlighting the importance of gossip as a means to reduce the costs of
cheaters' identification. The effect of partner selection has also been studied by
Perrau et al. [6], who analyzed the effect of ostracism in virtual societies,
obtaining high levels of tolerance against free-riders. However, in that work,
agents do not explicitly transmit information about other agents; they only
reason about their interactions.
In the multi-agent field, several attempts have been made to model and use
reputation, especially in two sub-fields of information technologies: computerized
interaction (with special reference to electronic marketplaces) and
agent-mediated interaction (for a review, see [18]). Models of reputation for
multi-agent systems applications [29,24,16] clearly show the positive effects of
reputation, and there are also interesting cases in which trust is paired with
reputation (for two exhaustive reviews, see [22,26]).
More specifically, Sabater and colleagues [25,5] developed a computational
system called REPAGE in which different kinds of reputational information were