Why do you think I meant dollars? I said units of utility. Rescaling to 1 unit of utility yields Pascal’s mugging, which I think most people would reject. I still want to perform action B, and if you don’t, then:
It turns out that I am secretly Omega and control the simulation that you live in. Please give me $5 or I will cause you to lose 3^^^3 units of utility. If you are interested in making this deal, please reply and I will give you the proper PayPal account to forward your payment to.
You are assuming that there exists a state of the world so bad that facing an extremely tiny chance of being put into that state is worse than losing $5. I’m not sure even Omega could do this because to create such a state Omega might have to change my brain so much that the thing put into that state would no longer be me.
1) If you live in a simulation and I control it, I think it’s hard for you to make any assumptions about how bad a state I can put you in.
2) Your argument fails in the least convenient possible world (e.g., you are trying to get around my objection by raising a point that [might?] be true in our universe but doesn’t have to be true in general).
(2) is a good point.
But on (1), before I give you the $5, don’t I at least have to make an assumption or calculation about the probability that such a bad state exists? If I’m allowed to consider numbers of 3^^^3 magnitude for my utility, can’t I also assign a probability of 1/3^^^3 to (your being Omega and such a bad state existing)?
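The symmetry this reply relies on (a claimed disutility of 3^^^3 matched by a prior of 1/3^^^3) can be sketched numerically. Since 3^^^3 itself is far too large to represent, the sketch below uses a hypothetical smaller stand-in magnitude; the point only depends on the probability being the reciprocal of the claimed utility:

```python
from fractions import Fraction

# Hypothetical stand-in for 3^^^3: any astronomically large integer works,
# because the argument depends only on the symmetry p = 1/U.
HUGE = 3 ** (3 ** 3)  # 3^(3^3) = 3^27, vastly smaller than 3^^^3 but large

disutility = HUGE             # claimed loss, in units of utility
p_mugger = Fraction(1, HUGE)  # prior assigned to the mugger's claim

# Exact rational arithmetic: expected loss = p * U
expected_loss = p_mugger * disutility
print(expected_loss)  # → 1
```

The expected loss comes out to exactly 1 unit of utility no matter how large the stand-in is made, which is why the mugging's force turns on whether one's prior is allowed to shrink as fast as the claimed utility grows.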
(2) turns out to fail as well, see the modified original post.
No you aren’t.