This has nothing to do with semantics. If smart people are saying “2+2=5” and I point out it’s 4, would you say “what matters is why you want to know what 2+2 is”?
The question here is very well defined. There is only one right answer. The fact that even very smart people come up with the wrong answer has all kinds of implications about the types of errors we might make on a regular basis (errors that lead to bad theories, bad decisions, etc).
If you mean something by probability other than “at what odds would you be indifferent to accepting a bet on this proposition”, then you need to explain what you mean. You are just coming across as confused. You’ve already acknowledged that Sleeping Beauty would be wrong to turn down a 50:50 bet on tails. What proposition is being bet on when you would be correct to be indifferent at 50:50 odds?
There is a mismatch between the betting question and the original question about probability.
At an awakening, she has no more information about heads or tails than she had originally, but we’re forcing her to bet twice under tails. So, even if her credence for heads was a half, she still wouldn’t make the bet.
Suppose I am going to flip a coin and I tell you that you win $1 if heads and lose $2 if tails. You could calculate that p(H) would have to be 2⁄3 in order for this to be a fair bet (i.e., to have 0 expectation). That doesn’t imply that p(H) is actually 2⁄3. It’s a different question. This is a really important point, and one that I think has caused much confusion.
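To make the arithmetic explicit, here is a minimal sketch of that fair-odds calculation (the stakes are the ones from the example above):

    # Win $1 on heads, lose $2 on tails: the bet has zero expectation
    # exactly when p(H) * 1 - (1 - p(H)) * 2 = 0, i.e. p(H) = 2/3.
    def expectation(p_heads, win=1.0, loss=2.0):
        return p_heads * win - (1 - p_heads) * loss

    print(expectation(2 / 3))  # 0.0  -> fair only if p(H) = 2/3
    print(expectation(1 / 2))  # -0.5 -> a losing bet if p(H) is really 1/2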
Do you think this analysis works for the fact that a well-calibrated Beauty answers “1/3”? Do you think there’s a problem with our methods of judging calibration?
You seem to agree she should take a 50:50 bet on tails. What would be the form of the bet where she should be indifferent to 50:50 odds? If you can answer this question and explain why you think it is a more relevant probability then you may be able to resolve the confusion.
Roko has already given an example of such a bet: one where she only gets one payout in the tails case. Is this what you are claiming is the more relevant probability? If so, why is this probability more relevant in your estimation?
Yes, one payout is the relevant case. The reason is that we are asking about her credence at an awakening.
How does the former follow from the latter, exactly? I seem to need that spelled out.
The interviewer asks about her credence ‘right now’ (at an awakening). If we want to set up a betting problem based around that decision, why would it involve placing bets on possibly two different days?
If, at an awakening, Beauty really believes that it’s tails with credence 0.67, then she would gladly take a single bet that wins $1 if tails and loses $1.50 if heads. If she wouldn’t take that bet, why should we believe that her credence for heads at an awakening is 1/3?
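For concreteness, a minimal sketch of that single bet’s expected value under each candidate credence:

    # Single per-awakening bet: win $1 if tails, lose $1.50 if heads.
    def bet_value(credence_heads):
        return (1 - credence_heads) * 1.0 - credence_heads * 1.5

    print(bet_value(1 / 3))  # +0.17 -> a thirder should gladly take it
    print(bet_value(1 / 2))  # -0.25 -> a halfer should refuse it

If she refuses, her behavior matches a credence of 1⁄2 for heads, not 1⁄3.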
What do you think the word “credence” means? I am thinking that perhaps that is the cause of your problems.
I’m treating credence for heads as her confidence in heads, expressed as a number between 0 and 1 (inclusive), given everything she knows at the time. I see it as the same thing as a posterior probability.
I don’t think disagreement is due to different uses of the word credence. It appears to me that we are all talking about the same thing.
I think that the difference between evaluating 2+2 and assigning probabilities (and the reason for the large amount of disagreement) is that 2+2 is a statement in a formal language, whereas which anthropic principle to accept, and how to interpret probability, are philosophical questions.
Don’t be fooled by the simple Bayes’ theorem calculations—they are not the hard part of this question.
So the difficult question here is which probability space to set up, not how to compute conditional probabilities given that probability space.
(Posted as an antidote to a misinterpretation of your comment that I committed a moment before.)
A philosophical question, as opposed to a formal one, is a question that hasn’t been properly understood yet. It is a case of ignorance in the mind, not a case of fuzzy territory.
If smart people are saying “2+2=5” and I point out it’s 4, would you say “what matters is why you want to know what 2+2 is”?
Yes. For example, let’s take a clearer mathematical statement, “3 is prime”. It seems that’s true whatever people say. However, if you come across some mathematicians who are having a discussion that assumes 3 is not prime, then you should think you’re missing some context rather than that they are bad at math.
I chose this example because I once constructed an integer-like system based on half-steps (the successor function adds .5). The system has a notion of primality, and 3 is not prime.
If you want a standard system where 3 is not prime consider Z[omega] where omega^3=1 and omega is not 1. That is, the set of numbers formed by taking all sums, differences, and products of 1 and omega.
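For what it’s worth, the claim about Z[omega] can be checked numerically; a minimal sketch using the standard identity 3 = -omega^2 * (1 - omega)^2, which exhibits 3 as a unit times the square of the non-unit (1 - omega):

    # In Z[omega] (the Eisenstein integers), 3 is not prime:
    # it factors as -omega**2 * (1 - omega)**2.
    import cmath
    omega = cmath.exp(2j * cmath.pi / 3)  # primitive cube root of unity
    print(-omega**2 * (1 - omega)**2)     # ~ (3+0j), confirming the identity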
What you should say when asked “What is 2+2?” is a separate question from what 2+2 is. 2+2 is 4, but you should probably say something else if the situation calls for it. The circumstances that could force you to say something in response to a given question are unrelated to what the answer to that question really is. The truth of the answer to a question is implicit in the question, not in the question-answering situation, unless the question is about the question-answering situation.
I disagree. The correct answer to a question is exactly what you should answer to that question. It’s what “correct” and “should” mean.
“Should” refers to the moral value of the outcome, and if someone is holding a gun to a puppy’s head and says “if you say that 2+2=4, the puppy will die!”, you shouldn’t answer “4” to the question, even though it’s correct that the answer is 4. Correctness is a concept separate from shouldness.
If someone asks you, “What do you get if you add 2 and 2?”, and you are aware that if you answer “4” he’ll shoot the puppy and if you answer “5” then he’ll let you and the puppy go, then the correct answer is “5”.
You are disputing definitions. You seem to include “should” among the possible meanings of “correct”. When you say, “in this situation, the correct answer is 5”, you refer to the “correctness” of the answer “5”, not to the correctness of 2+2 being 5. Thus, we are talking about an action, not about the truth of 2+2. The action can, for example, be judged according to moral value of its outcome, which is what you seem to mean by “correct” [action].
Thus, in this terminology, “5” is the correct answer, while it’s also correct that the [true] answer is 4. When I say just “the answer is 4”, this is shorthand for “the true answer is 4”, and doesn’t refer to the actual action, for which it’s true that “the [actual] answer is 5”.
Right, so for some arbitrary formal system, you can derive “4” from “2+2”, and for some other one, you can derive “5” from “2+2”, and in other situations, the correct response to “2+2” is “tacos”.
When you ask “What is 2+2?”, you mean a specific class of formal systems, not an “arbitrary formal system”. The subject matter is fixed by the question; the truth of its answer doesn’t refer to the circumstances of answering it, that is, to situations where you decide what utterance to produce in response.
The truth might be a strategy conditional on the situation in which you answer it, one that could be correctly followed given the specific situation, but that strategy is itself fixed by the question.
For example, I might ask “What should you say when asked the value of 2+2, taking into account the possibility of being threatened with the puppy’s death if you say something other than 5?” The correct answer to that question is a strategy where you say “4” unless the puppy’s life is in danger, in which case you say “5”. Note that the strategy is still fixed by the question, even though your action differs with the situation in which you carry it out; your action correctly brings about the truth of the answer to the question.
Given that Beauty is being asked the question, the probability that heads had come up is 1⁄3. This doesn’t mean the probability of heads itself is 1⁄3. So I think this is a confusion about what the question is asking. Is the question asking what is the probability of heads, or what is the probability of heads given an awakening?
Bayes’ theorem:
x = # of times awakened after heads
y = # of times awakened after tails
p(heads | awakened) = n(heads and awakened) / n(awakened) = x / (x + y)
Yields 1⁄3 when x = 1 and y = 2.
Where is the probability of heads? Actually, we already assumed in the calculation above that p(heads) = 0.5. For a general biased coin, the calculation is slightly more complex:
p(H) = probability of heads
p(T) = probability of tails
x = # of times awakened after heads
y = # of times awakened after tails
p(heads | awakened) = n(heads and awakened) / n(awakened) = p(H)x / (p(H)x + p(T)y)
Yields 1⁄3 when x = 1, y = 2, and p(H) = p(T) = 0.5.
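If it helps, that formula is easy to verify by simulation; here is a minimal sketch (the variable names are mine, chosen to match the definitions above):

    import random

    def fraction_of_awakenings_after_heads(p_heads=0.5, x=1, y=2, trials=100_000):
        """Fraction of all awakenings that follow a heads flip."""
        heads_awakenings = tails_awakenings = 0
        for _ in range(trials):
            if random.random() < p_heads:
                heads_awakenings += x  # x awakenings after heads
            else:
                tails_awakenings += y  # y awakenings after tails
        return heads_awakenings / (heads_awakenings + tails_awakenings)

    print(fraction_of_awakenings_after_heads())  # ~0.333, i.e. p(H)x / (p(H)x + p(T)y)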
I’m leaving this comment because I think the equations help explain how the probability-of-heads and the probability-of-heads-given-awakening are inter-related but, obviously—I know you know this already—not the same thing.
To clarify, since the probability-of-heads and the probability-of-heads-given-single-awakening-event are different things, it is indeed a matter of semantics: if Beauty is asked about the probability of heads per event … what is the event? Is the event the flip of the coin (p=1/2) or an awakening (p=1/3)? In the post narrative, this remains unclear.
Which event is meant would become clear if it was a wager (and, generally, if anything whatsoever rested on the question). For example: if she is paid per coin flip for being correct (event=coin flip) then she should bet heads to be correct 1 out of 2 times; if she is paid per awakening for being correct (event=awakening) then she should bet tails to be correct 2 out of 3 times.
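To make the two events concrete, here is a minimal sketch of a Beauty who always guesses the same side (the always-guess-one-side strategy is my simplification, not part of the original setup):

    import random

    def accuracy(guess, trials=100_000):
        """Returns (fraction of flips guessed right, fraction of awakenings guessed right)."""
        correct_flips = correct_awakenings = total_awakenings = 0
        for _ in range(trials):
            coin = 'H' if random.random() < 0.5 else 'T'
            awakenings = 1 if coin == 'H' else 2  # one awakening on heads, two on tails
            total_awakenings += awakenings
            if guess == coin:
                correct_flips += 1
                correct_awakenings += awakenings
        return correct_flips / trials, correct_awakenings / total_awakenings

    print(accuracy('H'))  # ~ (0.50, 0.33): right on half the flips, a third of awakenings
    print(accuracy('T'))  # ~ (0.50, 0.67): right on half the flips, two thirds of awakenings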
Actually … arguing with myself now … Beauty wasn’t asked about a probability; she was asked whether she thought heads had been flipped, in the past. So this is clear after all: did she think heads was flipped, or not?
Viewing it this way, I see the isomorphism with the class of anthropic arguments that ask if you can deduce something about the longevity of humans given that you are an early human. (Being a human in a certain century is like awakening on a certain day.) I suppose then my solution should be the same. Waking up is not evidence either way that heads or tails was flipped. Since her subjective experience is the same however the coin is flipped (she wakes up) she cannot update upon awakening that it is more likely that tails was flipped. Not even if flipping tails means she wakes up 10 billion times more than if heads was flipped.
However, I will think longer about whether there are any significant differences between the two problems. Thoughts?
Why was this comment down-voted so low? (I rarely ask, but this time I can’t guess.) Is it too basic math? If people are going to argue whether it’s 1⁄3 or 1⁄2, I think it is useful to know that they’re debating two different probabilities: the probability of heads, or the probability of heads given an awakening.
This is incorrect.
Given that Beauty is being asked the question, the probability that heads had come up is 1⁄2.
This is Bayes’ theorem:
p(H) = 1⁄2
p(awakened | H) = p(awakened | T) = 1
p(H | awakened) = p(awakened | H) p(H) / (p(awakened | H) p(H) + p(awakened | T) p(T)) = (1 · 1⁄2) / (1 · 1⁄2 + 1 · 1⁄2) = 1⁄2
By “awakened” here you mean “awakened at all”. I think you’ve shown already that the probability that heads was flipped given that she was awakened at all is 1⁄2, since in both cases she’s awakened at all and the probability of heads is 1⁄2. I think your dispute is with people who don’t think “I was awakened at all” is all that Beauty knows when she wakes up.
Beauty also knows how many times she is likely to have been woken up if the coin lands heads, and likewise for tails. She knew that from the start of the experiment.
OK, I see now why you are emphasizing being awoken at all. That is the relevant event, because that is exactly what she experiences and all that she has to base her decision upon.
(But keep in mind that people are just busy answering different questions, they’re not necessarily incorrect for answering a different question.)