I agree that probabilities are defined through wagers. I also think beliefs (or really, degrees of belief) are defined through wagers. That’s the way Bayesian epistemologists usually define degree of belief. So I believe X will occur with P = .5 iff a wager on X and a wager on a fair coin flip are equally preferable to me.
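For concreteness, here is a minimal sketch of that indifference condition. The even-money stake and payout are my own illustrative assumptions; the point is just that a wager on X and a wager on a fair coin flip have equal expected value exactly when P(X) = 0.5.

```python
# A minimal sketch of the indifference condition above. The even-money
# stake/payout numbers are illustrative assumptions, not from the comment.

def expected_value(p_win: float, stake: float = 1.0, payout: float = 2.0) -> float:
    """Expected value of staking `stake` to receive `payout` if the wager wins."""
    return p_win * payout - stake

ev_fair_coin = expected_value(0.5)            # wager on a fair coin flip
for p_x in (0.3, 0.5, 0.7):
    ev_x = expected_value(p_x)                # wager on X at the same stakes
    print(f"P(X) = {p_x}: EV(X) = {ev_x:+.2f}, EV(coin) = {ev_fair_coin:+.2f}, "
          f"indifferent = {abs(ev_x - ev_fair_coin) < 1e-9}")
```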
That’s fine. I guess I’m just not a Bayesian epistemologist.
If Sleeping Beauty is a Bayesian epistemologist, does that mean she refuses to answer the question as asked?
I’m not sure I have an official position on Bayesian epistemology, but I find the problem very confusing until you tell me what the payoff is. One might make an educated guess at the kind of payoff system the experiment designers would have had in mind—as many in this thread have done. (ETA: actually, you probably have to weight your answer according to your degree of belief in the interpretation you’ve chosen. Which is of course ridiculous. Let’s just include the payoff scheme in the experiment.)
I agree that more information would help Beauty, but I’m more interested in the issue of whether or not the question, as stated, is ill-posed.
One of the Bayesian vs. frequentist examples that I found most interesting was the case of the coin with unknown bias—a Bayesian would say it has a 50% chance of coming up heads, but a frequentist would refuse to assign a probability. I was wondering if perhaps this is an analogous case for Bayesians.
That wouldn’t necessarily mean anything is wrong with Bayesianism. Everyone has to draw the line somewhere, and it’s good to know where.
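A quick sketch of the Bayesian side of the unknown-bias example above, under my assumption of a uniform prior over the bias: averaging over the prior gives a predictive probability of heads of 1/2 (any prior symmetric about 1/2 gives the same answer).

```python
# Bayesian predictive probability of heads for a coin of unknown bias,
# assuming a uniform prior over the bias theta (my assumption, not the
# commenter's). The predictive probability is the prior mean of theta, 1/2.
import random

N = 100_000
heads = 0
for _ in range(N):
    theta = random.random()          # draw a bias from the uniform prior
    if random.random() < theta:      # flip a coin with that bias
        heads += 1

print(f"Estimated predictive P(heads) = {heads / N:.3f}")  # ~0.5
```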
I can understand that, but the fact that a wager has been offered distorts the probabilities under a lot of circumstances.
How do you mean?
I just flipped a coin. Are you willing to offer me a wager on the outcome I have already seen? Yet tradition would suggest you still have degrees of belief about the possible outcomes.
The offering of the wager itself can act as useful information. Some people wager to win.
I see what you mean. Yes, actual, literal wagers are messier than beliefs. Another example is a bet that the world is going to end, which you should obviously always bet against at any odds, even if you believe the last days are upon us. The equivalence between degree of belief and fair betting odds is a more abstract equivalence with an idealized bookie who offers bets on everything, doesn’t take a cut for himself, and pays out even if you’re dead.
Actually, I like that metaphor! Let me work this out:
The bookie would see heads and tails with equal probability. However, the bookie would also see twice as many bets when tails comes up. In order to make the vig zero, the bookie should pay out as much as comes in for whichever outcome comes up, and that works out to 1:2 on heads and 2:1 on tails! Thus, the bookie sets the probability of heads for Beauty at 1/3.
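Here is a minimal simulation of that book, in my own framing of the setup: Beauty stakes 1 on heads at every awakening and is paid 2 when the coin actually landed heads. At those odds the book breaks even in the long run, which are exactly the odds implied by P(heads) = 1/3.

```python
# Sleeping Beauty betting sketch (my framing): heads -> one awakening,
# tails -> two awakenings. Beauty risks 1 on heads at each awakening and
# wins 2 if the coin was heads. Average profit per experiment is ~0, so
# these are fair odds, corresponding to P(heads) = 1/3.
import random

def run(trials: int = 100_000) -> float:
    beauty_profit = 0.0
    for _ in range(trials):
        heads = random.random() < 0.5
        awakenings = 1 if heads else 2            # heads: one bet, tails: two bets
        for _ in range(awakenings):
            beauty_profit += 2 if heads else -1   # risk 1 to win 2 on heads
    return beauty_profit / trials

print(f"Average profit per experiment: {run():+.3f}")   # ~0.0
```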
To make an end-of-the-world bet, person A, who believes the world is not about to end, gives some money to person B, who believes the world is about to end. If, after an agreed-upon time, the world has not ended, person B gives a larger amount of money to person A.
It is harder to recover probabilities from the bets of this form that people are willing to make, because interest rates are a confounding factor.
Bets with money assume a fairly constant and universal utility-per-dollar rate. But that can’t be assumed in this case, since money isn’t worth nearly as much if the world is about to end.
So you’d have to adjust for that. And of course even if you can figure out a fair wager given this issue it won’t be equivalent to the right degree of belief.
It isn’t that hard, is it? We just find the interest rate on the amount B got to begin with, right?
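A rough sketch of that adjustment, with hypothetical numbers and ignoring the utility-of-money problem raised above: treat the bet as fair at B’s degree of belief p, let B invest the up-front amount at interest rate r, and solve for p.

```python
# End-of-world bet with an interest-rate adjustment (illustrative numbers).
# A hands B `x` now; B owes `y` in `t` years if the world is still here.
# Letting B invest x at rate r, the break-even condition at B's degree of
# belief p is  x * (1 + r)**t = (1 - p) * y,  so  p = 1 - x * (1 + r)**t / y.

def implied_doom_probability(x: float, y: float, r: float, t: float) -> float:
    return 1 - x * (1 + r) ** t / y

# Example: B gets $100 now and owes $150 in two years; interest rate 5%.
print(f"{implied_doom_probability(100, 150, 0.05, 2):.3f}")  # ~0.265
```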
But if person B is right, she only gets to enjoy the money until the world ends. It seems to me that money is less valuable when you can only derive utility from it for a small, finite period of time. You can’t get your money’s worth buying a house, for example. Plus, if belief in the end of the world is widespread, the economy will get distorted in a bunch of ways (in particular, the best ways to spend money with two weeks left to live would get really expensive), making it really hard to figure out what the fair bet would be.