If I give one person X amount of money, I can do this N times for N different people, producing on average U(X) utility each time.
If I give N people X/N amount of money, and do this N times for the same N people, I must be producing, on average, the same U(X) utility per iteration, since I have produced the same end result with the same number of iterations.
Do you expect:
That the first person I give money to will benefit more than the average person?
or
That the first penny I give people will benefit less than the average penny?
If so, why?
The second.
It’s not controversial that the marginal utility of money decreases when you’ve got a lot of it. Another dollar is worth less to a millionaire than to a beggar. But you also have a lot more than a hundred times as many purchasing options with a dollar as with a penny, so I think it’s likely that the utility of money is described by an S curve.
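Here is a minimal sketch of the difference between the two answers, under assumed utility functions that don’t come from the discussion above: a purely concave log utility and a logistic S-curve with arbitrary parameters. It compares the utility of the first penny given to someone starting from zero against the average utility per penny over a one-dollar gift.

```python
import math

# Two assumed utility-of-wealth functions (wealth in cents); forms and
# parameters are illustrative choices, not anything claimed in the thread.
def u_concave(wealth_cents):
    # diminishing marginal utility everywhere
    return math.log(1 + wealth_cents)

def u_s_curve(wealth_cents):
    # logistic S-curve: flat near zero, steep around $5, flat again after that
    return 1 / (1 + math.exp(-(wealth_cents - 500) / 100))

def first_vs_average_penny(u, start=0, total=100):
    first = u(start + 1) - u(start)                  # utility of the first penny
    average = (u(start + total) - u(start)) / total  # average utility per penny
    return first, average

for name, u in [("concave (log)", u_concave), ("S-curve", u_s_curve)]:
    first, average = first_vs_average_penny(u, start=0, total=100)
    print(f"{name:14s} first penny: {first:.6f}   average penny: {average:.6f}")

# With the concave function the first penny is worth more than the average penny;
# with this S-curve, starting from zero, it is worth less -- the second answer.
```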
This would make sense if everyone started with $0, or an exact round number of dollars, which seems like an odd scenario. However, if that’s your assumption, I agree with you.
I suppose I incorporated that assumption without really thinking about it. Now that you mention it, that would be a pretty odd situation.
Depending on where on the curve the slope levels off, it might still result in greater utility to give one person a lot of money than to distribute the same amount of money in one-cent units to very many people, but if you assume that a majority of the people are already middle class, that’s probably not the case.
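To put rough numbers on the “depends where the slope levels off” point, here is a toy comparison under the same kind of assumed logistic S-curve. The midpoint, scale, budget, and starting wealths are all made-up parameters for illustration, not anything claimed above: a fixed budget is either given entirely to one person or spread as one-cent gifts across many people, from two different starting wealths.

```python
import math

# Toy model: assumed logistic utility of wealth (in dollars), arbitrary parameters.
def u(wealth):
    # nearly flat when broke, steep around $1,000, flat again well past it
    return 1 / (1 + math.exp(-(wealth - 1000) / 150))

BUDGET = 10_000.00  # dollars to give away

def concentrated_gain(start):
    # entire budget to a single person starting at `start`
    return u(start + BUDGET) - u(start)

def distributed_gain(start):
    # one cent each to BUDGET / $0.01 people, all starting at `start`
    recipients = round(BUDGET / 0.01)
    return recipients * (u(start + 0.01) - u(start))

for label, start in [("everyone broke ($0)", 0.0),
                     ("everyone middle class ($3,000)", 3000.0)]:
    print(f"{label}: concentrated {concentrated_gain(start):.6f}, "
          f"distributed {distributed_gain(start):.6f}")

# With these made-up numbers, concentrating the budget wins when everyone starts
# at $0 (each penny lands on the flat toe of the curve), while spreading it out
# does better when recipients start past the steep part of the curve.
```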
I think the confusion here is more of a sorites paradox than anything about expected utility. You can’t imagine a single extra penny changing anyone’s outcome, but you can imagine it for a hundred extra pennies.
I think that’s a mistaken intuition; for instance, you could be about to buy something small, realize you’re a few cents short, and either cancel the transaction or use the “Take a Penny” jar; but the probabilities of doing either will actually change (for social reasons) depending on whether you’re, say, 3, 4 or 5 cents short of the total.
It’s rather like the way that being one more second late out the door can either get you to the office at the exact same time, or twenty minutes later, depending on the bus schedule.
Utility clearly isn’t linear with money, but I think you’re probably right that that intuition had something to do with my drawing the conclusion I did.