ABSTRACT

The analysis of trust and reciprocity in behavioral economics starts from the insight that people generally dislike unequal outcomes. People do not like to be treated unfairly, and they do not like to see others treated unfairly either. When we feel we are being treated unfairly, we are less likely to trust and reciprocate. The interplay between trust and reciprocity is a key element in many of the cooperative and collaborative activities we undertake every day, from collaborative teamwork (e.g., HART, DAOs, and skill transfer between humans and robots in the context of the Tactile Internet) to the altruism we show in donating to charity. In this chapter, we investigate methods to improve trust and reciprocity between agents. Toward this end, we focus on the widely studied trust game of behavioral economics in a blockchain context. Further, we present a networked version of the trust game involving third parties, called observers, whose primary goal is to discourage selfish behavior. Blockchain smart contracts and oracles are employed in both the classical and networked trust games. We demonstrate that our economic decisions can be shaped and reinforced by a range of factors beyond money, including a deposit mechanism that serves as a precommitment and self-control device for agents, and the presence of social influences such as peer pressure from observers and, more interestingly, persuasive social robots in the emerging field of robonomics.
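
To make the payoff structure concrete, the following is a minimal sketch of the classic trust game of Berg, Dickhaut, and McCabe: an investor sends part of an endowment to a trustee, the amount is multiplied in transit, and the trustee decides how much to return. The endowment and multiplier values here are the conventional choices from the literature, not parameters specified in this chapter, and the function name is illustrative.

```python
# Illustrative payoffs for the classic trust game.
# Endowment of 10 and multiplier of 3 are conventional, assumed values.

def trust_game(sent: float, returned: float,
               endowment: float = 10.0,
               multiplier: float = 3.0) -> tuple[float, float]:
    """Return (investor_payoff, trustee_payoff) for one round.

    The investor sends `sent` (0..endowment); it is multiplied in
    transit; the trustee keeps the multiplied pot minus `returned`.
    """
    assert 0 <= sent <= endowment
    pot = multiplier * sent
    assert 0 <= returned <= pot
    investor_payoff = endowment - sent + returned
    trustee_payoff = pot - returned
    return investor_payoff, trustee_payoff

# Full trust, evenly reciprocated: both parties gain relative to no trust.
print(trust_game(sent=10, returned=15))  # (15.0, 15.0)
# No trust: the investor keeps the endowment, the trustee gets nothing.
print(trust_game(sent=0, returned=0))    # (10.0, 0.0)
```

The tension the chapter studies is visible in the payoffs: sending money grows the total surplus, but a purely selfish trustee would return nothing, which is what deposit mechanisms and observer pressure are meant to counteract.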