Even more than the easier problem of remembering faces and matching them to favors, the ability of both parties to agree with sufficient accuracy on an estimate of the value of a favor in the first place is probably the main barrier to reciprocal altruism among animals. It is also likely the most important barrier to exchange among humans. Many kinds of exchange, probably many more than most economists perceive, are rendered infeasible by the inability of one or both parties to the exchange to estimate its value.
-Nick Szabo
Downvoted. Exchange does not require a common estimate of “value”, although reciprocal altruism probably does. Rational agents will undertake every exchange that makes both of them better off according to each agent’s own utility function. Assuming TDT, agents that are similar to each other will also reach a Pareto optimum in a bilateral monopoly game.
Humans might sometimes fail to reach agreement in a bilateral monopoly, but that need not imply any disagreement about “value”: for instance, they might disagree about their relative bargaining positions, or one party might use brinkmanship to extract concessions from the other.
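To make the point concrete, here is a minimal sketch (with hypothetical utility functions, not anything from the original argument): two agents whose utilities assign very different worths to the same goods can still both agree to a trade, with no shared estimate of “value” anywhere in the decision.

```python
# Minimal sketch: exchange without a common estimate of "value".
# The goods and utility numbers below are invented for illustration.

def better_off(utility, holdings_before, holdings_after):
    """True if the agent strictly prefers the post-trade holdings."""
    return utility(holdings_after) > utility(holdings_before)

# Agent A holds an apple, agent B holds a banana.
# Their utility functions disagree sharply about each good's worth.
utility_a = lambda good: {"apple": 1, "banana": 5}[good]  # A prefers bananas
utility_b = lambda good: {"apple": 4, "banana": 2}[good]  # B prefers apples

# Proposed exchange: A gives the apple and receives the banana.
trade_happens = (better_off(utility_a, "apple", "banana")
                 and better_off(utility_b, "banana", "apple"))
print(trade_happens)  # True: both gain, despite no agreement on "value"
```

Each agent only checks the trade against its own utility function; nothing requires the two valuations to coincide, which is the sense in which exchange is weaker than reciprocal altruism's bookkeeping of favors.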
This seems like anthropomorphic pessimism.
See: http://en.wikipedia.org/wiki/Stotting#Purpose, http://en.wikipedia.org/wiki/Signalling_theory, http://www.cracked.com/article_19456_8-things-you-wont-believe-plants-do-when-no-ones-looking_p2.html (especially #1) and http://lesswrong.com/lw/st/anthropomorphic_optimism/.