A new study indicates that human generosity may have a limit, even when being generous would fulfill our selfish tendencies too. In an economic game where investing all available resources in cooperation with others gave the largest possible reward, players would still hold back from contributing everything they had, and continued to view their collaborators as potential competitors. Our ability to cooperate appears to have definite limits, just as our selfishness does.
Many scientists have suggested that humans cooperate as well as we do because of natural selection. When a person could only do so much sabre-tooth-tiger killing or berry-gathering in a day, cooperating and exchanging goods gave groups of humans a competitive edge over other species. Observing this tendency in studies led some researchers to conclude that we have a predilection for engaging in as much cooperation as we can get back in return. The new work suggests that those study designs may have been flawed.
The games that scientists use to identify the limits of selfishness usually work by asking a group of people to contribute resources to a public project. Their contributions are added together, increased slightly, then divided equally among them again.
Most people will contribute initially, even though they would do best to hold onto their money and feed off the contributions of others. Contributions usually decline over repeated rounds, with fewer than 10 percent of people still contributing after a few rounds.
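The payoff structure described above can be sketched in a few lines of Python. The endowment of 100 units and the multiplier of 1.5 are illustrative assumptions; the article only says contributions are "increased slightly" before being redistributed.

```python
# Standard public-goods game: contributions are pooled, multiplied by a
# factor r, and the pot is split equally among all n players.
# (endowment=100 and r=1.5 are illustrative assumptions, not values
# from the study.)

def standard_payoff(contributions, endowment=100, r=1.5):
    """Return each player's payoff for one round."""
    n = len(contributions)
    share = r * sum(contributions) / n
    return [endowment - c + share for c in contributions]

# A free rider who contributes nothing ends the round ahead of the
# full contributors, which is why contributions tend to decay:
payoffs = standard_payoff([0, 100, 100, 100])
# free rider: 100 + 112.5 = 212.5; each contributor: 0 + 112.5 = 112.5
```

This makes the tension concrete: the group as a whole earns the most when everyone contributes, but each individual earns more by holding back.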
The new study, published in PNAS, asserts that this style of study has a couple of problems. One is that, because the game rewards cooperation, selfish and generous tendencies butt up against each other, making it hard to isolate the effects of either one. Another is that any mistakes by the participants can have lasting effects over multiple rounds.
For example, if a player mistakenly enters "100" as a contribution when he meant "10," a much larger amount gets distributed among the group, making others inclined to contribute. This can unrealistically extend the portion of the game where participants remain generous.
To correct this, researchers made some alterations to the game so there were fewer pulls and pushes between selfishness and generosity. In the new version, subjects received at least 1.25 times the contribution they made to the public project, with nothing riding on what others in the group did. The optimal approach for any player would be to contribute all of their money every time.
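The modified game removes the tension entirely: each player's return depends only on their own contribution. A minimal sketch, again assuming an endowment of 100 units, which the article does not specify:

```python
# Modified game from the study: each player receives 1.25 times their
# own contribution back, regardless of what the rest of the group does,
# so contributing everything is the dominant strategy.
# (endowment=100 is an illustrative assumption.)

def modified_payoff(contribution, endowment=100, factor=1.25):
    """Return one player's payoff; independent of others' choices."""
    return endowment - contribution + factor * contribution

# Contributing everything beats any partial contribution:
modified_payoff(100)  # 125.0
modified_payoff(60)   # 115.0
```

Under these rules a purely self-interested player should contribute 100 percent every round, which is what makes the observed 60 percent cap so surprising.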
The rates of contribution did go up compared with the usual economic game, but curiously remained stuck at around 60 percent of the available resources. Furthermore, 92 percent of players said they did not perceive their fellow group members as full collaborators, indicating they still felt competitive towards them.
So the researchers tried a couple of other variants. They added rewards for the group that collectively put the most money towards the public project. They also tried situations where participants lost whatever money they did not contribute, and varied whether the people were informed of the rules beforehand.
Even when they were explicitly informed that they would lose any money they didn't contribute, and would gain a larger reward if they worked together with their group, players would still regularly contribute less than 100 percent of their resources. And it wasn't even real money.
While the contributions remained high, indicating that participants understood how the game worked, they still held out some resources. This doesn't disprove our prosocial tendencies, but it indicates that our generosity and desire to work together are limited: something holds us back from fully cooperating.
The authors of the study suggest that the reluctance might be the result of a psychological development that makes us averse to any kind of extreme behavior, either selfish or selfless, even if the rules are set up to reward the extreme.
This may be advantageous in the sense that extreme strategies could be costly if we misunderstood the rules, or if the rules changed without warning. The authors even suggest that our brains may deliberately introduce some kind of calculation error in extreme situations to prevent hazardous "all-in" commitments. In any case, our preferences in collaborative situations are clear, just not absolute.
Source: ars technica