Your transformation seems to require weird Omegas that respond to randomizing players by randomizing too. It’s not clear to me why an Omega would want to behave like that (probabilistically reward cheaters). Can you handle other kinds of Omegas, e.g. the original kind specified by Eliezer?
I don’t think they’re weird. I think Omegas that go out of their way to discriminate against mixed strategies are weird. A strategy that one-boxes with probability 0.999 never gets a million, while one that one-boxes with probability 1 always gets a million. You could call that a discontinuity.
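To make the discontinuity concrete, here's a minimal Python sketch. The dollar amounts and the two Omegas' exact rules are my assumptions for illustration: the "strict" Omega leaves box B empty unless the player one-boxes with probability exactly 1, and the "randomizing" Omega fills box B with probability equal to the player's one-boxing probability p.

```python
MILLION = 1_000_000
THOUSAND = 1_000

def ev_strict_omega(p: float) -> float:
    """Expected payoff vs an Omega that punishes any mixing (assumed rule)."""
    if p == 1.0:
        return MILLION                # pure one-boxers find box B full
    return (1 - p) * THOUSAND         # box B empty; only two-boxing pays

def ev_randomizing_omega(p: float) -> float:
    """Expected payoff vs an Omega that fills box B with probability p (assumed rule)."""
    one_box = p * (p * MILLION)                   # box B's expected content
    two_box = (1 - p) * (p * MILLION + THOUSAND)  # box B plus the sure $1,000
    return one_box + two_box

for p in (0.9, 0.999, 1.0):
    print(f"p={p}: strict={ev_strict_omega(p):,.0f}  randomizing={ev_randomizing_omega(p):,.0f}")
# strict jumps from ~1 at p=0.999 to 1,000,000 at p=1 (the discontinuity);
# randomizing varies smoothly: p * 1,000,000 + (1 - p) * 1,000.
```

Note that even under the randomizing Omega, expected payoff still increases in p, so pure one-boxing remains optimal; the difference is just that the payoff function is continuous.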
And I thought 1 was not a probability anyway! Any real rational one-boxing agent will expect to one-box with probability ~1, not with “probability” 1. Does that mean that the agent is using a mixed strategy? On the other hand, any agent that isn’t using quantum randomness will in fact either one-box or two-box, even if it flips coins and stuff. Does that mean the agent is using a pure strategy? I can’t answer this off the top of my head.
I assume the following is the key thing about Eliezer’s original Omega:
Omega has been correct on each of 100 observed occasions so far—everyone who took both boxes has found box B empty and received only a thousand dollars; everyone who took only box B has found B containing a million dollars.
I didn’t see Eliezer saying that Omega doesn’t tolerate mixed strategies. If there were coinflippers among those 100, presumably Omega predicted the results of their coinflips and set up box B accordingly. To the extent that I can’t duplicate the conditions perfectly to make sure any coin will land the same way both times, I can’t do that. To the extent that I can, I can.
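Concretely, here's a toy sketch of that, assuming the agent's "coin" is a seeded pseudorandom generator, so "duplicating the conditions" means running a copy of the agent on the same seed:

```python
import random

def agent(rng: random.Random) -> str:
    """A coinflipper: one-box iff the coin lands heads."""
    return "one-box" if rng.random() < 0.5 else "two-box"

def omega_fills_box_b(seed: int) -> bool:
    """Omega 'predicts' by running a copy of the agent under identical conditions."""
    return agent(random.Random(seed)) == "one-box"

for seed in range(5):
    box_b_full = omega_fills_box_b(seed)        # Omega sets up box B first
    action = agent(random.Random(seed))         # the real flip, same conditions
    assert box_b_full == (action == "one-box")  # the prediction never misses
```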
Uh, then my transformation of the problem is better than yours because it “predicts” coinflips perfectly, not just “to the extent that I can” :-)
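(In case that's unclear, here's one possible reading of the transformation, purely as an illustration and not necessarily the original construction: a single random draw drives both the player's action and box B's content, so the "prediction" of the coinflip is exact by construction.)

```python
import random

def transformed_round(p_one_box: float, rng: random.Random) -> int:
    """One round where the same random draw drives both sides (assumed reading)."""
    one_box = rng.random() < p_one_box   # the player's possibly-mixed choice
    box_b = 1_000_000 if one_box else 0  # box B tracks the very same draw
    return box_b if one_box else box_b + 1_000

rng = random.Random(0)
payoffs = [transformed_round(0.5, rng) for _ in range(100_000)]
print(sum(payoffs) / len(payoffs))  # ~500,500: mixing is rewarded probabilistically
```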