Sorry, but I’m not in the habit of taking one for the quantum superteam.
If you’re not willing to “take one for the team” of superyous, I’m not sure you understand the implications of “every implementation of you is you.”
And I don’t think that it really helps to solve the problem;
It does solve the problem, though, because it's a consistent way to formalize the decision so that, on average, you come out ahead on problems like this.
it just means that you don’t necessarily care so much about winning any more. Not exactly the point.
I think you’re missing the point here. Winning in this case is doing the thing that on average nets you the most success for problems of this class, one single instance of it notwithstanding.
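The "winning on average" claim can be checked with a quick expected-value sketch. The payoffs below are the figures quoted in this thread ($10000 on heads if you're the kind of agent who pays, minus $100 on tails); the function names are just illustrative.

```python
# Expected value of the two policies in the counterfactual-mugging setup
# discussed here: a fair coin, and Omega pays $10000 on heads only to
# agents whose policy is to hand over $100 on tails.
P_HEADS = 0.5  # fair coin: equal measure of heads-worlds and tails-worlds

def expected_value(pays_on_tails: bool) -> float:
    # Omega rewards the paying policy in heads-worlds...
    heads_payoff = 10_000 if pays_on_tails else 0
    # ...but that policy costs $100 in tails-worlds.
    tails_payoff = -100 if pays_on_tails else 0
    return P_HEADS * heads_payoff + (1 - P_HEADS) * tails_payoff

print(expected_value(True))   # policy that pays on tails
print(expected_value(False))  # policy that refuses
```

Under these numbers the paying policy nets $4950 per instance on average, the refusing policy $0, which is the sense in which paying "wins" over the class of problems even though it loses in any particular tails-world.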
Plus we are explicitly told that the coin is deterministic and comes down tails in the majority of worlds.
And this explains why you're missing the point. We are told no such thing. We are told it's a fair coin, and that can only mean that if you divide up the worlds by their probability measure, you win in half of them. That is part of the problem's definition.
What seems to be confusing you is that you're told "in this particular problem, for the sake of argument, assume you're in one of the worlds where you lose." It says nothing about those worlds being overrepresented.
No, take another look:
in the overwhelming measure of the MWI worlds it gives the same outcome. You don’t care about a fraction that sees a different result, in all reality the result is that Omega won’t even consider giving you $10000, it only asks for your $100.