If I play this game many times, say 100, and update on getting a green ball each time, I will lose on average, and after 100 games I will be in the red. So in this game it is better not to update on personal position, and EY used this example to demonstrate the power of his Updateless Decision Theory.
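As a sanity check of the "losing on average" claim, here is a small simulation. The specific numbers are my assumptions, not details given above: one commonly cited version of the setup has 20 participants, a fair coin choosing an 18-green/2-red or 2-green/18-red assignment, and a collective bet (executed whenever the green holders accept) paying +$1 per green holder and costing $3 per red holder.

```python
import random

# Hedged sketch of one assumed version of the setup (20 people,
# 18/2 split, +$1 per green / -$3 per red collective bet).
def play(update: bool, n_games: int = 100_000) -> float:
    """Average collective payoff per game under a fixed policy."""
    total = 0.0
    for _ in range(n_games):
        heads = random.random() < 0.5
        greens = 18 if heads else 2
        reds = 20 - greens
        # An updater who sees green assigns P(Heads) = 0.9 and accepts,
        # so the bet is always executed; a non-updater always refuses.
        if update:
            total += greens * 1 - reds * 3
    return total / n_games

random.seed(0)
print(play(update=True))   # negative on average (about -20 per game)
print(play(update=False))  # 0.0
```

Each individual green holder's updated expectation of the bet is positive, yet the group loses on average, which is exactly the tension the comment describes.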
Another example: imagine that for each real me, 10 Boltzmann Brains (BBs) appear in the universe. Should I go to the gym? If I update toward being a BB, I should not, as the gym is useless for BBs: they will disappear soon. However, I can adopt the rule of ignoring the BB possibility and going to the gym, and in that case the real me gets the benefits of the gym.
Numerically it is trivial to say that the better thing to do (for each bet, for the benefit of all participants) is not to update. The question, of course, is how we justify this. After all, it is fairly uncontroversial that the probability of the mostly-green urn is 0.9 when the randomly assigned ball I receive turns out to be green. You can enlist a new type of decision theory such as UDT, or a new type of probability theory that allows two probabilities to both be valid depending on the betting scheme (as Ape in the Coat did). What I am suggesting is to stick with traditional CDT and probability theory, but to recognize the difference between the coordination strategy and the personal strategy, because they come from different perspectives.
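The "uncontroversial" 0.9 is just a one-line Bayes computation. A minimal sketch, assuming the standard setup where a fair coin picks the urn and the mostly-green urn is 18/20 green while the other is 2/20 green (these proportions are my assumption):

```python
# Per-person update on seeing a green ball, under assumed 18/20 vs 2/20 urns.
p_mostly_green = 0.5                 # prior: fair coin picks the urn
p_green_if_mostly_green = 18 / 20
p_green_if_mostly_red = 2 / 20

p_green = (p_mostly_green * p_green_if_mostly_green
           + (1 - p_mostly_green) * p_green_if_mostly_red)
posterior = p_mostly_green * p_green_if_mostly_green / p_green
print(posterior)  # 0.9
```

Nothing controversial happens here; the dispute is only about what to *do* with this number.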
For the BB example you posted, my long-held position is that there is no way to reason about the probability of “I am a BB”, even with the added assumption that for each real me, 10 BBs appear in the universe. However, if you really are a BB, then your decision doesn’t matter to your personal interest, as you will disappear momentarily. So you can make your personal decision entirely on the assumption that you are not a BB. Alternatively, and I would say not very realistically, you can assume that the real you and the BB copies care about each other and want to come up with a coordinated strategy that benefits the entire group, with each then faithfully following that strategy without thinking about their own personal strategy. In this example both approaches recommend the same decision: going to the gym.
I absolutely didn’t create a new type of probability theory.
People just happen to have some bizarre misconceptions about probability theory, like “you are always supposed to use the power set of the sample space as your event space” or “you can’t use more than one probability space to describe a problem”. And I point out that nothing in formal probability theory actually justifies such claims. See my recent post and discussion with Throwaway2367 for another example.
Suppose there is another betting rule in the same setting:
Every person in the experiment is asked to guess whether the coin landed Heads or Tails. If they guess correctly, they personally gain 10 dollars; otherwise they personally lose 10 dollars.
Now you may notice that if you see green, the correct behaviour is to take this personal bet and refuse the collective bet, thus simultaneously updating and not updating. This may appear paradoxical, unless you understand that we are talking about different probabilities.
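The two recommendations can be checked side by side. This sketch assumes the same stylized setup as before (18/2 split of green balls between the coin outcomes, and a collective bet of +$1 per green holder / -$3 per red holder); those payoffs are my assumptions:

```python
# Personal bet: after seeing green, guess Heads (mostly-green urn),
# winning or losing $10 personally. The updated probability applies.
p_heads_given_green = 0.9
ev_personal = p_heads_given_green * 10 + (1 - p_heads_given_green) * (-10)
print(ev_personal)  # positive (about +8): take the bet

# Collective bet: +$1 per green holder, -$3 per red holder, evaluated
# over coin flips (Heads: 18 green, 2 red; Tails: 2 green, 18 red).
ev_collective = 0.5 * (18 * 1 - 2 * 3) + 0.5 * (2 * 1 - 18 * 3)
print(ev_collective)  # negative (-20): refuse the bet
```

The personal bet is evaluated with the updated per-person probability, while the collective bet is evaluated from the coordination perspective over coin flips, which is why one is accepted and the other refused with no contradiction.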