I think that someone who believes in many-worlds will keep drawing cards until they die.
You have to include the presumption that there is a quantum variable that conditions the skull card, and there is a question about whether a non-quantum event strongly conditioned on a quantum event counts for quantum immortality … but assume Omega can do this.
The payoff, then, looks like it favors going to an arbitrarily high number of draws given that quantum immortality is true. Honestly, my gut response is that I would stop at either 3, 9, or 13 draws, depending on how risk-averse I felt and how much baseline utility I expected (a twice-as-high baseline utility lets me go one doubling less).
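A rough sketch of why the payoff looks that way. The post never fixes the exact rules, so assume each draw kills you with some probability p and otherwise doubles your utility; both the rules and the numbers below are illustrative, not taken from the scenario:

```python
# Assumed rules (not fixed by the post): each draw kills you with
# probability p_death; surviving a draw doubles your utility.

def expected_utility(baseline, draws, p_death):
    """Ordinary expected utility, counting death as 0:
    each draw multiplies expected utility by 2 * (1 - p_death),
    so drawing only helps while 2 * (1 - p_death) > 1."""
    return baseline * (2 * (1 - p_death)) ** draws

def survival_conditioned_utility(baseline, draws):
    """Utility conditioned on surviving every draw (the quantum-
    immortality view): every draw doubles utility, so more draws
    always look better."""
    return baseline * 2 ** draws

u0 = 1.0
p = 0.5  # illustrative per-draw death chance
print(expected_utility(u0, 3, p))           # 1.0 -- no gain when 2*(1-p) == 1
print(survival_conditioned_utility(u0, 3))  # 8.0

# The "twice-as-high baseline lets me go one doubling less" point:
# doubling the baseline and dropping one draw gives the same total.
print(survival_conditioned_utility(2 * u0, 2)
      == survival_conditioned_utility(u0, 3))  # True
```

Under survival-conditioning the utility grows without bound in the number of draws, which is exactly why the payoff "favors going to an arbitrarily high number" if quantum immortality is true.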
I think this says that my understanding of utility falls prey to diminishing returns when it shouldn't (partly a problem with utility itself), and that I don't really believe in quantum immortality, because I am choosing a response that is optimal for a non-quantum-immortality scenario.
But in any reasonable situation where I encounter this scenario, my response is accurate: it takes into account my uncertainty about immortality (which requires a few more assumptions than just the MWI) and also accounts for my updating my beliefs about quantum immortality based on evidence from the bet itself. That any agent, even an arbitrarily powerful one, is willing to bet an arbitrarily large number of doublings of my utility against quantum immortality is phenomenal evidence against it. Phenomenal. Utility is so complicated, and doubling just gets insane so quickly.
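The update described above can be sketched as an ordinary Bayes calculation. All numbers here are assumed for illustration: the point is just that if a powerful agent would rarely offer such a bet in worlds where quantum immortality holds, then observing the offer pushes belief in it down sharply:

```python
# Illustrative Bayes update on quantum immortality (QI), given that a
# powerful agent offers the bet. All probabilities below are assumed.

def posterior(prior_qi, p_offer_given_qi, p_offer_given_not_qi):
    """P(QI | bet offered) by Bayes' rule."""
    num = prior_qi * p_offer_given_qi
    denom = num + (1 - prior_qi) * p_offer_given_not_qi
    return num / denom

# Assume an agent that knows QI holds would offer this losing bet only
# 1% of the time, versus 50% of the time otherwise.
print(posterior(0.5, 0.01, 0.5))  # roughly 0.0196
```

Even starting from a 50/50 prior, the offer alone drops belief in quantum immortality to about 2% under these assumed likelihoods, which is the sense in which the bet is "phenomenal evidence against it."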
You have to include the presumption that there is a quantum variable that conditions the skull card, and there is a question about whether a non-quantum event strongly conditioned on a quantum event counts for quantum immortality … but assume Omega can do this.
Neither the problem itself nor this response need make any mention of quantum immortality. Given an understanding of many-worlds, 'belief in quantum immortality' is just a statement about preferences in a certain type of scenario. There isn't some kind of special phenomenon involved; it is just a matter of choosing what sort of preferences you have over future branches.
That any agent, even an arbitrarily powerful one, is willing to bet an arbitrarily large number of doublings of my utility against quantum immortality is phenomenal evidence against it. Phenomenal.
No, no, no! Apart from being completely capricious with essentially arbitrary motivations, they aren't betting against quantum immortality. They are betting a chance of killing someone against a chance of making ridiculous changes to the universe. Quantum immortality just doesn't play a part in their payoffs at all.