
Not that Selfish after all...


Homo economicus is selfish. He does not give to others. But what about actual human beings? Are we really that selfish? My dear friend Bruce Rigal, a banker turned behavioural economist, does not think so.


Maimonides, the medieval Jewish philosopher, described eight levels of charity, each greater than the one before. At the lowest levels, we give unwillingly, inadequately or only after being asked. Then comes charity given directly into the hand of the recipient, so that neither giver nor receiver is anonymous. A higher level occurs when we do not know to whom we give but they know our identity. Higher still is when we know to whom we are giving, but they don't know who we are, a choice offered on the most popular charity-giving websites. The second-highest level is where both giver and receiver remain anonymous. The highest level involves helping someone earn for themselves: lending them money, setting them up in employment or partnering with them in business.

Maimonides certainly provides noble guidance to our fickle selves.


Do we give to others willingly and without receiving something in return? Standard economic theory predicts that individuals will not volunteer to give away something of value without receiving a favour in return. This theory was formalised by von Neumann and Morgenstern (1944), who saw people as utility, or value, maximisers. Almost all 20th-century theory and government policy, in the West at least, have been based on this presumption.

Later in the 20th century, behavioural and experimental economists used social preference games in the laboratory to see whether people cooperate or compete, and whether they share their wealth or act selfishly.

The most famous of these games is probably the prisoner’s dilemma, dramatically portrayed in “A Beautiful Mind,” a film about John Nash. Nash was a brilliant mathematician and a rationalist. He predicted that prisoners questioned in two separate police cells would each betray their partner-in-crime to maximise their own position, rather than cooperate by keeping quiet so that both would receive a lesser sentence. This Nash equilibrium sounds sensible in theory, but in the laboratory people often do cooperate in prisoner’s dilemma situations, showing that we care about more than just saving ourselves.
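To make the logic concrete, here is a minimal sketch of the dilemma in Python. The sentence lengths in the payoff table are illustrative assumptions of mine, not figures from the film or from any experiment; the point is only that, whatever the other prisoner does, defecting is the selfish best response.

```python
# A minimal sketch of the prisoner's dilemma. The sentence lengths below are
# illustrative assumptions (years in prison, so lower is better).

PAYOFFS = {
    # (my move, their move): (my sentence, their sentence)
    ("quiet", "quiet"):   (1, 1),    # both cooperate: light sentences
    ("quiet", "defect"):  (10, 0),   # I keep quiet, they talk: I take the fall
    ("defect", "quiet"):  (0, 10),   # I talk, they keep quiet: I walk free
    ("defect", "defect"): (5, 5),    # both talk: heavy sentences for both
}

def best_response(their_move):
    """Return the move that minimises my own sentence, given the other's move."""
    return min(("quiet", "defect"), key=lambda my_move: PAYOFFS[(my_move, their_move)][0])

for their_move in ("quiet", "defect"):
    print(f"If they play {their_move!r}, my selfish best response is {best_response(their_move)!r}")

# Defecting is the best response either way, so (defect, defect) is the Nash
# equilibrium, even though (quiet, quiet) would leave both prisoners better off.
```

Running the sketch prints "defect" as the best response in both cases, which is why the cooperation observed in laboratories is a genuine departure from the purely selfish prediction.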

Another great yet simple game for testing social preferences is the dictator game. Two players participate: one is randomly assigned to be the dictator and the other the receiver. The dictator is endowed with (given) a sum of money, say £10, and is then asked whether they want to give any of their new-found wealth to the other, passive player. There is no coercion; the dictator can choose to give some, all or none of their endowment. So what do dictators actually do?
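The structure of the game is simple enough to write down in a few lines. The sketch below is only illustrative: the function name and the example splits are mine, and the £10 endowment is the figure used above.

```python
# An illustrative sketch of the dictator game's structure. The helper name and
# the example splits are assumptions for exposition, not part of any study.

def dictator_game(endowment, transfer):
    """Split an endowment between the dictator and a passive receiver."""
    if not 0 <= transfer <= endowment:
        raise ValueError("The dictator may give anything from nothing up to the whole endowment.")
    return endowment - transfer, transfer

# The purely selfish homo economicus keeps everything...
print(dictator_game(10, 0))     # (10, 0)
# ...whereas a dictator who gives away roughly a quarter, as in the studies
# discussed next, ends up with a split like this:
print(dictator_game(10, 2.50))  # (7.5, 2.5)
```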

In a 2011 meta-study of dictator games, Christoph Engel found that 36% of dictators give nothing, as standard economic theory predicts, but a full 17% give exactly half and very few give more than half. On average, people give away 28% of the endowment. In effect, we cut our cake into four roughly equal pieces, give one piece to someone else and keep the remaining three for ourselves.

What determines the level of giving? Demographics matter. Engel found that women and older people give more, while students seem to give less. Surprisingly, level of education and employment status did not affect the level of giving.


So how much would you give? Well, you might say ‘it depends’, and that is true: in the laboratory, context, as Merle has pointed out, really matters. Maimonides knew that anonymity decreases giving significantly. It seems that contributions are a function of social concerns about what others think of us, as well as of our desire for fairness.

What about risk? Most of these games test choices between certain outcomes. Participants with an endowment are asked how much they want to give, and the outcome, how much people are willing to share, is known with certainty once decisions are made. The real-life decisions we make about giving to others, and about taking risk on their behalf, are not nearly so deterministic and involve wide probability distributions of outcomes: think of those who rushed to help in the Grenfell Tower fire in London last year.


In a dictator experiment run at the University of Warwick last year, participants gave 26% of their endowments when outcomes were certain. Giving fell significantly to 20% when risk was added. We plan to run similar experiments to see whether people will actually choose to have less information and more risk before deciding whether to give. We think participants will choose to remain ignorant of their own endowment and of the plight of the other. This is called “creating moral wiggle room” (Dana, Weber & Xi Kuang, 2007). By knowing less, we can rationalise not giving: not engaging with the beggar on the street to hear their plight, for example, or not checking how much change we have in our pockets. We can then retain false beliefs about whether the other person deserves their poverty and about our own capacity to give.
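One simple way to picture what adding risk to a dictator game might look like, purely as an illustration and not the actual Warwick design, is to let the transfer reach the receiver only with some probability. In the sketch below, the coin-flip mechanism, the 50% probability and the function name are all assumptions of mine.

```python
# A hypothetical risky variant of the dictator game, for illustration only:
# the dictator gives up the transfer for certain, but the receiver only gets
# it with probability p (otherwise it is lost).
import random

def risky_dictator_game(endowment, transfer, p, rng):
    """Return (dictator's payoff, receiver's payoff) under coin-flip risk."""
    received = transfer if rng.random() < p else 0.0
    return endowment - transfer, received

rng = random.Random(0)
outcomes = [risky_dictator_game(10, 2.0, 0.5, rng) for _ in range(10_000)]
avg_received = sum(received for _, received in outcomes) / len(outcomes)

# The dictator is down £2 for certain, while the receiver gains only about £1
# on average: one intuition for why giving might fall once risk is introduced.
print(f"Average amount actually received: £{avg_received:.2f}")
```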


More than 800 years ago, Maimonides already knew what we are like and didn’t need social preference experiments to confirm his beliefs. We are not always uncaring, selfish or economically rational. However, context matters, and we need coaxing to do better, particularly when we don’t know the receivers of our good deeds and they don’t know us. The hardest step is to increase our own risk in order to reduce the risk of another, say by taking them into our homes, hiring them, lending them money to start a business or going into partnership with them.

Maimonides certainly had it right.


References

Dana, J., Weber, R. A. & Xi Kuang, J. (2007). Exploiting moral wiggle room: Experiments demonstrating an illusory preference for fairness. Economic Theory, 33(1), 67-80.

Engel, C. (2011). Dictator games: A meta study. Experimental Economics, 14(4), 583-610.

von Neumann, J. & Morgenstern, O. (1944). Theory of Games and Economic Behavior. Princeton University Press.

