No, I’m not making any claims about ethics here, just math.
Works against Thirdism in the Fissure experiment too.
Yep, because it’s wrong in Fissure as well. But I’ll be talking about it later.
I mean, if you are going to precommit to the right strategy anyway, why do you even need probability theory?
To understand whether you should precommit to any strategy and, if you should, then to which one. The fact that
P(Heads|Blue) = P(Heads|Red) = 1⁄3
but
P(Heads|Blue or Red) = 1⁄2
means that you may precommit to either Blue or Red and it doesn’t matter which, but if you don’t precommit, you won’t be able to guess Tails better than chance per experiment.
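If it helps, here is a quick Monte Carlo sketch of these numbers. I’m assuming the setup where one of the two days is Red and the other Blue, with Monday’s color set by a separate fair coin; the variable names and the scoring below are just for illustration.

```python
import random

N = 100_000
heads_total = 0       # "Blue or Red" is observed in every experiment
blue_seen_total = 0   # experiments where Blue is observed at some point
heads_and_blue = 0    # Heads experiments where Blue is observed
right_guesses = 0     # precommitted strategy: on a Red awakening, guess Tails
wrong_guesses = 0

for _ in range(N):
    heads = random.random() < 0.5       # the coin of the experiment
    monday_red = random.random() < 0.5  # Monday's color; Tuesday gets the opposite
    # Colors actually observed: Monday only if Heads, both days if Tails.
    days_red = [monday_red] if heads else [monday_red, not monday_red]
    saw_blue = any(not red for red in days_red)
    saw_red = any(days_red)

    heads_total += heads
    blue_seen_total += saw_blue
    heads_and_blue += heads and saw_blue

    if saw_red:                         # guess Tails on the Red awakening
        if heads:
            wrong_guesses += 1
        else:
            right_guesses += 1

print("P(Heads | Blue seen in experiment) ~", heads_and_blue / blue_seen_total)  # ~ 1/3
print("P(Heads | Blue or Red seen)        ~", heads_total / N)                   # ~ 1/2
print("Correct Tails guesses per experiment ~", right_guesses / N)  # ~ 1/2 (every Tails experiment)
print("Wrong Tails guesses per experiment   ~", wrong_guesses / N)  # ~ 1/4 (half of Heads experiments)
```

So the precommitted Beauty guesses Tails correctly in about half of all experiments while being wrong in only a quarter of them, which is the better-than-chance-per-experiment performance described above.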
The whole question is how you decide to ignore that P(Heads|Blue) = 1⁄3 when you chose Red and see Blue. And how is it not “a probabilistic model produces incorrect betting odds” when you need to precommit to ignore it?
You do not ignore it. When you choose Red and see that the walls are Blue, you do not observe the event “Blue”. You observe the outcome “Blue”, which corresponds to the event “Blue or Red”, because the sigma-algebra of your probability space is affected by your precommitment.
So you bet 1:1 on Red after observing this “Blue or Red”?
Yes! There is a 50% chance that the coin is Tails, and so the room will be Red at some point in this experiment.
No, I mean the Beauty awakes, sees Blue, gets a proposal to bet on Red with 1:1 odds, and you recommend accepting this bet?
Yes, if the bet is about whether the room takes the color Red in this experiment, which is what the event “Red” means in Technicolor Sleeping Beauty according to the correct model. The fact that you do not observe the event “Red” in this awakening doesn’t mean that you don’t observe it in the experiment as a whole.
The situation somewhat resembles learning that today is Monday and still being ready to bet at 1:1 odds that a Tuesday awakening will happen in this experiment. Though with the colors there actually is an update, from 3⁄4 to 1⁄2.
What you probably meant to ask is whether you should agree to bet at 1:1 odds that the room is Red in this particular awakening, after you wake up and see that the room is Blue. And the answer is no, you shouldn’t. But the probability space for Technicolor Sleeping Beauty does not talk about probabilities of events happening in this awakening, because most of them are ill-defined for reasons explained in the previous post.
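To put rough numbers on that update, here is the arithmetic as I understand the model being argued for (a sketch, not a general derivation; the halfer value P(Tails) = 1⁄2 after awakening is the model’s own claim):

```python
p_tails = 0.5

# Prior: under Heads the single awakening's room is Red with probability 1/2,
# under Tails one of the two days is always Red.
p_red_in_experiment_prior = (1 - p_tails) * 0.5 + p_tails * 1.0   # 0.75

# After seeing Blue walls: per this model the coin stays at 1/2 (the observed
# event is "Blue or Red"), and Red still occurs in the experiment iff the coin
# is Tails, since under Heads the only awakening is the Blue one just seen.
p_red_in_experiment_posterior = p_tails * 1.0                     # 0.5

print(p_red_in_experiment_prior, p_red_in_experiment_posterior)   # 0.75 0.5
```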
So probability theory can’t possibly answer whether I should take free money, got it.
And even if “Blue” means “Blue happens during the experiment”, you wouldn’t accept worse odds than 1:1 for Blue, even when you see Blue?
No, that’s not what I said. You just need to use a different probability space with a different event: “observing Red on any particular day of the experiment”.
You can do this because the probability of observing the color is the same for every day. Unlike, say, Tails in the initial coin toss, whose probability is 1⁄2 on Monday and 1 on Tuesday.
It’s indeed a curious thing which I wasn’t thinking about, because you can arrive at the correct betting odds on the color of the room for any day using the correct model for Technicolor Sleeping Beauty: as P(Red) = P(Blue) and the rewards are mutually exclusive, U(Red) = U(Blue) and therefore 1:1 odds. But this was sloppy of me, because to formally update when you observe the outcome you still need an appropriate separate probability space, even if the update is trivial.
So thank you for bringing it to my attention; I’m going to talk more about it in a future post.
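For concreteness, here is a quick frequency check of that per-day symmetry, under the same assumed setup as the earlier sketch (Monday’s color random, Tuesday’s the opposite): the walls are Red about half the time among awakenings on either day, while Tails has probability 1⁄2 among Monday awakenings and 1 among Tuesday awakenings.

```python
import random
from collections import Counter

N = 100_000
red_count, tails_count, day_count = Counter(), Counter(), Counter()

for _ in range(N):
    tails = random.random() < 0.5
    monday_red = random.random() < 0.5
    awakenings = [("Monday", monday_red)]
    if tails:
        awakenings.append(("Tuesday", not monday_red))  # opposite color on Tuesday
    for day, red in awakenings:
        day_count[day] += 1
        red_count[day] += red
        tails_count[day] += tails

for day in ("Monday", "Tuesday"):
    print(day,
          "P(Red) ~", round(red_count[day] / day_count[day], 3),      # ~ 0.5 on both days
          "P(Tails) ~", round(tails_count[day] / day_count[day], 3))  # ~ 0.5 Monday, 1.0 Tuesday
```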