Let’s look at actual outcomes here. If every human says yes, 95% of them get to the afterlife. If every human says no, 5% of them get to the afterlife. So it seems better to say yes in this case, unless you have access to more information about the world than is specified in this problem. But if you accept that it’s better to say yes here, then you’ve basically accepted the doomsday argument.
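To make that arithmetic explicit, here is a minimal sketch of the comparison. The 95% / 5% fractions come straight from the problem as stated; the total population is a made-up number, since only the ratio matters:

```python
# Outcome comparison under the two uniform policies. The 0.95 / 0.05
# fractions are from the problem; the population size is hypothetical.

def afterlife_count(population: int, fraction_saved: float) -> float:
    """Number of humans reaching the afterlife if everyone follows the same policy."""
    return population * fraction_saved

population = 100_000_000_000  # hypothetical total number of humans, ever

everyone_says_yes = afterlife_count(population, 0.95)
everyone_says_no = afterlife_count(population, 0.05)

print(f"all say yes: {everyone_says_yes:.3e} reach the afterlife")
print(f"all say no:  {everyone_says_no:.3e} reach the afterlife")
# The "yes" policy wins by a factor of 19, which is the whole force of the
# argument in the paragraph above.
```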
There’s a chance you’re changing the nature of the situation by introducing Omega. Often “beliefs” and “betting strategy” go together, but here that may not be the case. You’d have to show that the decision in the Omega game bears any relation to other decisions.
There’s a chance this Omega game is only “an additional layer of tautology” that doesn’t justify anything. We need to consider more games; I can suggest a couple of examples.
Game 1:
Omega: There are two worlds; one is much more populated than the other. In the bigger one magic exists, in the smaller one it doesn’t. Would you bet that magic exists in your world? Would you actually update your beliefs and keep that update?
One person could argue it becomes beneficial to “lie” about your beliefs / adopt temporary doublethink. Another could argue for permanently changing your mind about magic.
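A quick simulation makes the betting intuition concrete, assuming you are a uniformly random observer drawn from the two worlds combined (that assumption is doing all the work, and the population sizes are made up):

```python
import random

# Rough simulation of Game 1 under a "random observer" assumption.
BIG_WORLD_POP = 1_000_000    # magic exists here (hypothetical size)
SMALL_WORLD_POP = 1_000      # no magic here (hypothetical size)

def sample_observer_in_big_world() -> bool:
    """Return True if a uniformly sampled observer lives in the magical (big) world."""
    i = random.randrange(BIG_WORLD_POP + SMALL_WORLD_POP)
    return i < BIG_WORLD_POP

trials = 100_000
wins = sum(sample_observer_in_big_world() for _ in range(trials))
print(f"betting 'magic exists' wins {wins / trials:.1%} of the time")
# Under this sampling assumption the bet is overwhelmingly favorable; whether
# that licenses a *permanent* belief change is the open question.
```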
Game 2:
Omega: I have this protocol. When you stand on top of a cliff, I give you a choice to jump or not. If you jump, you die. If you don’t, I create many perfect simulations of this situation. If you jump in a simulation, you get a reward. Wanna jump?
You can argue that “jumping means death, so the reward is impossible to get”, unless you have access to true randomness that can vary across perfect copies of the situation. IDK. Maybe “making the Doomsday update beneficially” is impossible.
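Here is a rough expected-value sketch of that “true randomness” escape hatch. The payoff numbers, and the assumption that rewards to simulated copies trade off against the real death on one scale, are mine and not part of the game:

```python
# Hedged expected-value sketch of Game 2, assuming the policy is "jump with
# probability p" using a genuinely random coin that can land differently in
# the real instance and in each simulated copy. All numbers are made up.

N_SIMULATIONS = 1_000_000      # hypothetical number of perfect simulations
REWARD_PER_SIM_JUMP = 1.0      # hypothetical value of one simulated reward
COST_OF_REAL_DEATH = 10_000.0  # hypothetical disvalue of dying for real

def expected_value(p_jump: float) -> float:
    """Expected payoff of 'jump with probability p' applied in every instance."""
    # If the real you jumps (prob p), you die and no simulations are created.
    cost = p_jump * COST_OF_REAL_DEATH
    # If the real you stays (prob 1 - p), N simulations are created and each
    # copy independently jumps with probability p, collecting a reward.
    reward = (1 - p_jump) * N_SIMULATIONS * p_jump * REWARD_PER_SIM_JUMP
    return reward - cost

for p in (0.0, 0.01, 0.5, 1.0):
    print(f"p = {p:4.2f}: expected value = {expected_value(p):+,.1f}")
# Deterministic policies (p = 0 or p = 1) never beat zero here; only a
# genuinely random intermediate p can come out ahead, which is the sense in
# which randomness varying across perfect copies might rescue the deal.
```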
You did touch on exactly that, so I’m not sure how much my comment agrees with your opinions.