So, as I’ve heard Mike Munger explain it, fairness is evolution’s solution to the equilibrium outcome selection problem. “Solution to the what?” you ask. This would be easy to explain if you’re familiar with the Edgeworth box.
In a simplified economy consisting of two people and two goods, the two people have some combination of different tastes and different initial baskets of things. Suppose that you have 20 oranges and 5 apples, that I have 3 oranges and 30 apples, and that we each prefer a balanced mix of fruits to either extreme. We can trade apples for oranges in ways that make each of us strictly better off, but there’s a whole continuum of trades that do so. And with your highly advanced social brain, you can tell that some of these trades are shit deals, like when I offer you 1 apple for 12 of your oranges. Even though we’d both still benefit, you’d be inclined to immediately counteroffer with something closer to the middle of the continuum of mutually beneficial exchanges, or with a point that benefits you more, as a reprimand for my being a jerk. Dealing fairly with each other skips costly repeated bargaining, and standing up to jerks who deviate from approximate fairness preserves the norm.
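If you want to see the continuum rather than take my word for it, here’s a minimal sketch in Python. The square-root (Cobb-Douglas) utility is my stand-in assumption for “prefer a balanced basket”; only the endowments come from the example above, and the two trades I print are just illustrative picks.

```python
# A toy version of the two-person, two-good economy above.
# Assumption: utility sqrt(oranges * apples) encodes "prefer a
# balanced basket to either extreme." Everything else follows
# from the endowments in the example.

from math import sqrt

def utility(oranges: int, apples: int) -> float:
    """Assumed utility: higher when the basket is more balanced."""
    return sqrt(oranges * apples)

# Endowments: you hold (20 oranges, 5 apples), I hold (3, 30).
you0 = utility(20, 5)   # = 10.0
me0 = utility(3, 30)    # ~ 9.49

# Enumerate integer trades: I give you `a` apples for `r` oranges.
deals = []
for a in range(1, 31):
    for r in range(1, 21):
        you = utility(20 - r, 5 + a)
        me = utility(3 + r, 30 - a)
        if you > you0 and me > me0:   # both strictly better off
            deals.append((a, r, you - you0, me - me0))

print(f"{len(deals)} mutually beneficial integer trades")
# Compare a lopsided deal with a middling one to see the split:
for a, r, dy, dm in deals:
    if (a, r) in {(6, 9), (12, 7)}:
        print(f"I give {a} apples for {r} oranges: "
              f"your gain {dy:.2f}, my gain {dm:.2f}")
```

Under that assumed utility, trading 6 apples for 9 oranges leaves you only barely better off while I capture most of the surplus, whereas 12 for 7 splits the gains roughly evenly. The counteroffer intuition is about where on that continuum we land.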
This is the sort of intuition that we’re trying to test for in the Ultimatum game.
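For concreteness, here’s the game itself as a toy: a proposer splits a pot, and the responder can reject, zeroing out both payoffs. The 30% threshold is purely an assumed stand-in for the responder’s fairness norm; where real thresholds sit is exactly what the experiment measures.

```python
# A bare-bones Ultimatum game. The rejection threshold is an
# assumption; empirically it varies across people and cultures.

def ultimatum(pot: float, offer: float, min_share: float = 0.30):
    """Proposer keeps pot - offer; responder accepts iff the offer
    clears their fairness threshold, else both get nothing."""
    if offer >= min_share * pot:
        return pot - offer, offer   # deal: (proposer, responder)
    return 0.0, 0.0                 # rejection: punish the lowball

print(ultimatum(10.0, 5.0))   # fair split -> (5.0, 5.0)
print(ultimatum(10.0, 1.0))   # lowball    -> (0.0, 0.0): norm enforced
```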