How We Believe: Science and the Search for God - Michael Shermer

fitness of those outside the group,” demonstrate that “natural selection can operate at more than one level of the biological hierarchy.” They show how “individual selection favors traits that maximize relative fitness within single groups,” and that “group selection favors traits that maximize the relative fitness of groups.” Of course, “altruism is maladaptive with respect to individual selection but adaptive with respect to group selection.” Therefore, they conclude, “altruism can evolve if the process of group selection is sufficiently strong.” For example, they cite William Hamilton’s analysis of how consciousness might have provided a group selective advantage for certain human populations with regard to the ethical enforcement of rules: “Consider also the selective value of having a conscience. The more consciences are lacking in a group as a whole, the more energy the group will need to divert to enforcing otherwise tacit rules or else face dissolution. Thus considering one step (individual vs. group) in a hierarchical population structure, having a conscience is an ‘altruistic’ character.”

Part of the problem in this debate is in how certain terms are defined, such as altruism and cooperation, and the tendency to force these categories into either-or choices for human actions. Humans are either altruistic or selfish. Humans are either cooperative or competitive. But altruistic and cooperative are not reified things; they are behaviors. And like all behaviors, there is a broad range of expression, from a little to a lot. Applying fuzzy logic can help clarify this complex human phenomenon, where we might assign fuzzy numbers to altruism or cooperation. Depending on the circumstances, someone might be, say, .2 altruistic and .8 nonaltruistic (or selfish), or .6 cooperative and .4 noncooperative (or competitive). Humans can be both altruistic and nonaltruistic, cooperative and noncooperative.
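The fuzzy assignments above can be sketched in a few lines of code: rather than a binary label, each person gets a degree of membership in a category, with the complementary degree going to its opposite. The function name and the validation are illustrative, not from the text.

```python
# Sketch of the fuzzy-logic framing: instead of labeling a person
# "altruistic" or "selfish" outright, assign a degree of membership
# in each category. The two degrees are complements summing to 1.

def fuzzy_profile(altruism):
    """Return complementary fuzzy memberships for one behavioral axis.

    `altruism` is a degree between 0.0 and 1.0; its complement is the
    degree of selfishness.
    """
    if not 0.0 <= altruism <= 1.0:
        raise ValueError("fuzzy degrees must lie in [0, 1]")
    return {"altruistic": altruism, "selfish": round(1.0 - altruism, 10)}

# The example from the text: someone who is .2 altruistic is .8 selfish;
# the .6 cooperative / .4 competitive case works the same way.
print(fuzzy_profile(0.2))  # {'altruistic': 0.2, 'selfish': 0.8}
```

The same scheme extends to any behavioral axis the text mentions, with one degree per pole rather than an either-or label.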

One problem with reciprocal altruism is this: How do I know that if I scratch your back you will scratch mine? I am more than willing to cooperate with unrelated members of my community, but only if I am reasonably certain that they are going to reciprocate. How can I find out who are the cooperators and who are the defectors? Gossip is one way. Past experience with my fellow community members is another. Combined, these give me enough information to make a decision (even if it is on an unconscious level) about whom I can trust.

In a way, daily life can be modeled by a game theory technique called the Prisoner’s Dilemma. Two individuals who cooperated in committing a crime are caught, arrested, and offered the chance of a reduced sentence if one will rat out the other. The district attorney can convict both of them of a minor offense, but if one of them confesses, he can go free while the other rots in jail with a long sentence. What will they do? It depends on their respective reputations for being trustworthy. Let us simplify the game so that each player gets one point if both cooperate, either player can get two points by defecting when the other cooperates, and both get zero points if both defect. When only one round of the game is played, most people defect. But when the game is iterated, or repeated for numerous rounds with the same players, cooperation is the norm. When you learn that your partner is a cooperator and not a defector, you become a cooperator yourself.
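The simplified payoffs above can be written out as a matrix to show why one-shot play favors defection. The text leaves the cooperator's payoff against a defector unstated; 0 is assumed here for illustration.

```python
# Simplified one-shot Prisoner's Dilemma payoffs from the text:
# mutual cooperation pays 1 each, defecting against a cooperator
# pays 2, mutual defection pays 0. The sucker's payoff (cooperating
# against a defector) is not given in the text; 0 is assumed.

PAYOFF = {
    ("C", "C"): (1, 1),
    ("C", "D"): (0, 2),  # assumed sucker's payoff for the cooperator
    ("D", "C"): (2, 0),
    ("D", "D"): (0, 0),
}

# Against a cooperator, defecting (2) beats cooperating (1); against a
# defector, both moves pay 0 under these assumed payoffs. Defection
# never does worse -- which is why, when only one round is played,
# most people defect.
assert PAYOFF[("D", "C")][0] > PAYOFF[("C", "C")][0]
assert PAYOFF[("D", "D")][0] >= PAYOFF[("C", "D")][0]
```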

To test this hypothesis the mathematician and political scientist Robert Axelrod held a contest by inviting people to submit a computer program to play the iterated Prisoner’s Dilemma. Pitting the programs against each other for 200 games each, he tallied up the payoff scores and found that the winning program was the simplest one, designed by Anatol Rapoport and called Tit for Tat. The program chooses to cooperate on the first round, and then on all subsequent moves it matches the choice of its opponent. Tit for Tat, says evolutionary biologist John Maynard Smith, is an Evolutionarily Stable Strategy, or “a strategy such that, if all the members of a population adopt it,
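Tit for Tat is simple enough to sketch in full. The following is a minimal iterated-play simulation in the spirit of Axelrod's 200-round matchups, using the simplified payoffs given earlier (the sucker's payoff of 0 is an assumption, as is the "always defect" opponent used for contrast).

```python
# Minimal sketch of Axelrod-style iterated play. Payoffs follow the
# text's simplified scheme (1/1 mutual cooperation, 2/0 defection
# against a cooperator, 0/0 mutual defection); the sucker's payoff
# of 0 is assumed, not stated in the text.

PAYOFF = {("C", "C"): (1, 1), ("C", "D"): (0, 2),
          ("D", "C"): (2, 0), ("D", "D"): (0, 0)}

def tit_for_tat(opponent_history):
    """Cooperate on the first round, then copy the opponent's last move."""
    return "C" if not opponent_history else opponent_history[-1]

def always_defect(opponent_history):
    """An illustrative contrast strategy: defect unconditionally."""
    return "D"

def play(strategy_a, strategy_b, rounds=200):
    """Return cumulative scores for both strategies over iterated rounds."""
    seen_by_a, seen_by_b = [], []  # each side's record of opponent moves
    score_a = score_b = 0
    for _ in range(rounds):
        a, b = strategy_a(seen_by_a), strategy_b(seen_by_b)
        pa, pb = PAYOFF[(a, b)]
        score_a += pa
        score_b += pb
        seen_by_a.append(b)
        seen_by_b.append(a)
    return score_a, score_b

# Two Tit for Tat players cooperate on every round: 200 points each.
print(play(tit_for_tat, tit_for_tat))    # (200, 200)
# Against a constant defector, Tit for Tat is exploited only once,
# then matches defection for the remaining rounds.
print(play(tit_for_tat, always_defect))  # (0, 2)
```

The second result shows why the strategy is hard to exploit: after losing the first round, it gives up nothing further, which is part of why it outscored more elaborate programs in Axelrod's tournament.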

