I came up with a decision theory problem. It has the same moral as XOR blackmail, but I think it's much easier to understand:
Omega has chosen you for an experiment:
First, Omega predicts your choice in a potential future offer.
Omega rolls a die. Omega doesn’t show you the result.
If Omega predicted you would choose $200, they will only make you an offer if the die shows 6.
If Omega predicted you would choose $100, they will make you an offer if the die shows any number except 1.
Omega’s offer, if made, is simple: “Would you like $100 or $200?”
You received an offer from Omega. Which amount do you choose?
I didn't come up with a catchy name for it, though.
This version was easier for me to understand (though almost anything is easier to understand the second time you see it, phrased in a different way).
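Assuming Omega's prediction is perfect and the die is fair, the ex-ante expected value of each fixed policy can be checked with a short simulation (the function names here are just illustrative, not part of the original problem):

```python
import random

def run_trial(policy, rng):
    """One trial: Omega (assumed to predict `policy` perfectly) rolls a fair
    die and makes the offer only under the stated conditions.
    Returns the payout in dollars (0 if no offer is made)."""
    roll = rng.randint(1, 6)
    if policy == 200:
        offered = (roll == 6)       # $200-choosers get an offer only on a 6
    else:
        offered = (roll != 1)       # $100-choosers get an offer on 2..6
    return policy if offered else 0

rng = random.Random(0)
trials = 100_000
for policy in (100, 200):
    avg = sum(run_trial(policy, rng) for _ in range(trials)) / trials
    print(f"policy ${policy}: average payout ≈ ${avg:.1f}")
```

The averages come out near $83.3 for the $100 policy (5/6 × $100) and near $33.3 for the $200 policy (1/6 × $200), which is the tension the problem is built around: once the offer is in front of you, $200 looks strictly better, yet the agent who was predicted to take $100 does far better on average.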