Let’s reverse this and see if it makes more sense. Say I give you a die that looks normal, but you have no evidence about whether it’s fair. Then I offer you a two-sided bet: I’ll bet $101 to your $100 that it comes up odd. I’ll also offer $101 to your $100 that it comes up even. Assuming that transaction costs are small, you would take both bets, right?
If you had even a small reason to believe that the die was weighted towards even numbers, on the other hand, you would take one of those bets but not the other. So if you take both, you are exhibiting a probability estimate of exactly 50%, even though it is “uncertain” in the sense that it would not take much evidence to move that estimate.
Huh? If I take both bets, there is the certain outcome of me winning $1 and that involves no risk at all (well, other than the possibility that this die is not a die but a pun and the act of rolling it opens a transdimensional portal to the nether realm...)
True, you’re sure to make money if you take both bets. But if you think the probability is 51% on odd rather than 50%, you make a better expected value by only taking one side.
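The arithmetic here can be sketched quickly (using the dialogue's stakes: each accepted bet pays you $101 if your side comes up and costs you $100 otherwise):

```python
# Each accepted bet pays +$101 if your side comes up, -$100 otherwise.
def ev_side(p_win):
    """Expected value of betting one side, given the probability of winning it."""
    return p_win * 101 - (1 - p_win) * 100

p_odd = 0.51  # a credence slightly favoring odd

ev_both = ev_side(p_odd) + ev_side(1 - p_odd)  # one side wins, one loses
ev_odd_only = ev_side(p_odd)

print(f"both sides: ${ev_both:.2f}")      # $1.00 regardless of p_odd
print(f"odd only:   ${ev_odd_only:.2f}")  # $2.51 at 51% credence
```

Taking both sides locks in exactly $1 no matter what, while at 51% credence the single odd-side bet has a higher expected value.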
The thing is, I’m perfectly willing to accept the answer “I don’t know”. How will I bet? I will not bet.
There is a common idea that “I don’t know” necessarily implies a particular (usually uniform) distribution over all the possible values. I don’t think this is so.
You will not bet on just one side, you mean. You already said you’ll take both bets because of the guaranteed win. But unless your credence is quite precisely 50%, you could increase your expected value over that status quo (guaranteed $1) by choosing NOT to take one of the bets. If you still take both, or if you now decide to take neither, it seems clear that loss aversion is the reason (unless the amounts are so large that decreasing marginal value has a significant effect).
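As a sketch of how precise “quite precisely 50%” has to be with these stakes: a single side has expected value 201p − 100 dollars, so it only beats the guaranteed $1 once the credence drifts past 101/201 ≈ 50.25% (or, symmetrically, below 100/201 for the other side). Exact rationals make the indifference point explicit:

```python
from fractions import Fraction

STAKE, PAYOUT = 100, 101

def ev_one_side(p):
    """EV of taking only the side that wins with probability p."""
    return p * PAYOUT - (1 - p) * STAKE

# Taking both sides nets exactly PAYOUT - STAKE = $1 with certainty.
guaranteed = PAYOUT - STAKE

# One side beats the guaranteed $1 once p*(PAYOUT+STAKE) - STAKE > guaranteed,
# i.e. p > (STAKE + guaranteed) / (PAYOUT + STAKE) = 101/201.
threshold = Fraction(STAKE + guaranteed, PAYOUT + STAKE)
print(threshold, float(threshold))  # 101/201 ≈ 0.5025
```

So "precisely 50%" really means a narrow band of roughly half a percentage point either way; outside it, dropping one bet is the better deal.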
From my point of view it’s not a bet—there is no uncertainty involved—I just get to collect $1.
Not loss aversion—risk aversion. And yes, in most situations most humans are risk averse. There are exceptions—e.g. lotteries and gambling in general.
I’m not sure what you mean here by risk aversion. If it’s not loss aversion, and it’s not due to decreasing marginal value, what is left?
Would you rather have $5 than a 50% chance of getting $4 and a 50% chance of getting $7? That, to me, sounds like the kind of risk aversion you’re describing, but I can’t think of a reason to want that.
Aversion to uncertainty :-)
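One way to check whether diminishing marginal value could explain preferring the certain $5 over the 50/50 gamble between $4 and $7: even under noticeably concave utility functions (square root or log, applied to these amounts in isolation, which if anything overstates the curvature), the gamble still comes out ahead. This is a numerical sketch, not a claim about anyone's actual utility function:

```python
import math

outcomes = [(0.5, 4), (0.5, 7)]  # the professor's gamble
certain = 5

for name, u in [("linear", lambda x: x),
                ("sqrt", math.sqrt),
                ("log", math.log)]:
    eu_gamble = sum(p * u(x) for p, x in outcomes)
    prefers = "gamble" if eu_gamble > u(certain) else "certain $5"
    print(f"{name:6}: prefers {prefers}")
```

All three utility functions prefer the gamble at these amounts, which is why curvature alone can't account for taking the certain $5.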
Let me give you an example. You are going to the theater to watch the first showing of a movie you really want to see. At the ticket booth you discover that you forgot your wallet and can’t pay the ticket cost of $5. A bystander offers to help you, but because he’s a professor of decision science he offers you a choice: a guaranteed $5, or a 50% chance of $4 and a 50% chance of $7. What do you pick?
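The theater example can be read as a step-function utility: anything below the $5 ticket price is worth nothing right now, so what matters is the probability of covering the ticket. A minimal sketch under that assumption:

```python
TICKET = 5

def p_afford(offers):
    """Probability the amount received covers the ticket, given (prob, amount) pairs."""
    return sum(p for p, amount in offers if amount >= TICKET)

certain_offer = [(1.0, 5)]
gamble_offer = [(0.5, 4), (0.5, 7)]

print(p_afford(certain_offer))  # 1.0 -> you see the movie for sure
print(p_afford(gamble_offer))   # 0.5 -> half the time you're stuck outside
```

Under a threshold like this, the certain $5 dominates despite its lower expected dollar value.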
That’s a great example, but it goes both ways. If the professor offered you a choice between guaranteed $4 and a 50% chance between $5 and $2, you’d be averse to certainty instead (and even pay some expected money for the privilege). Both kinds of scenarios should happen equally often, so it can’t explain why people are risk-averse overall.
“Both kinds of scenarios should happen equally often”

Not in real life, they don’t.
People planning future actions prefer the certainty of having the necessary resources on hand at the proper time. Crudely speaking, that’s what planning is. If the amount of resources that will be available is uncertain, people often prefer to create that certainty by acquiring enough resources so that even the lower-bound amount is sufficient—and that involves paying the price of getting more (in expectation) than you need.
Because people do plan, the situation of “I’ll pick the sufficient and certain amount over a chance to lose and a chance to win” occurs much more often than “I certainly have insufficient resources, so a chance to win is better than no chance at all”.
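The buffering point can be put numerically. Suppose (numbers made up for illustration) you need 5 usable units at a set time and each unit you acquire independently turns out usable with probability 0.9; you then buy enough that the chance of falling short is small, and the expected surplus is the price you pay for that certainty:

```python
import math

NEED, P_GOOD, TARGET = 5, 0.9, 0.99  # illustrative parameters

def p_enough(n):
    """P(at least NEED of n independently acquired units turn out usable)."""
    return sum(math.comb(n, k) * P_GOOD**k * (1 - P_GOOD)**(n - k)
               for k in range(NEED, n + 1))

n = NEED
while p_enough(n) < TARGET:
    n += 1

# Buying n > NEED units; expected surplus n*P_GOOD - NEED is the cost of certainty.
print(n, round(p_enough(n), 4))
```

Here you end up buying noticeably more than you expect to need, which is exactly the "certain sufficiency over a gamble" pattern described above.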