“You didn’t know, but the predictor knew what you’ll do, and if you one-box, that is your property that predictor knew, and you’ll have your reward as a result.”
No. That makes sense only if you believe that causality can work backwards. It can’t.
“If predictor can verify that you’ll one-box (after you understand the rules of the game, yadda yadda), your property of one-boxing is communicated, and it’s all it takes.”
Your property of one-boxing can’t be communicated backwards in time.
We could get bogged down in discussions of free will; I am assuming free will exists, since arguing about which choice to make doesn’t make sense otherwise. Maybe the Predictor is always right. Maybe, in this imaginary universe, rationalists are screwed. I don’t care; I don’t claim that rationality is always the best policy in alternate universes where causality doesn’t hold and 2+2=5.
What if I’ve decided I’m going to choose based on a coin flip? Is the Predictor still going to be right? (If you say “yes”, then I’m not going to argue with you anymore on this topic, because that would be arguing about how to apply rules that work in this universe to a different universe.)
Presumably the Predictor would be smart enough to calculate the result of that coin flip.
If it were an actually random bit, then I don’t know. In the real universe, as you require, the Predictor would have a 50% chance of being right. If the Predictor thought you might do that, it probably wouldn’t offer you the challenge, in order to maintain its reputation for omniscience.
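The 50% figure is just the observation that no prediction strategy can correlate with an independent fair coin. A minimal sketch (the predictor strategy here is a hypothetical stand-in; any strategy independent of the coin gives the same result):

```python
import random

def predictor_accuracy(trials: int = 100_000, seed: int = 0) -> float:
    """Pit a predictor against a chooser who decides by a fair coin flip.

    The predictor's strategy (always predict one-boxing) is a hypothetical
    placeholder; because the coin is flipped independently of the
    prediction, ANY strategy converges to ~50% accuracy.
    """
    rng = random.Random(seed)
    correct = 0
    for _ in range(trials):
        prediction = "one-box"  # placeholder strategy; choice of strategy doesn't matter
        choice = rng.choice(["one-box", "two-box"])  # genuinely random, independent bit
        if prediction == choice:
            correct += 1
    return correct / trials

print(predictor_accuracy())  # close to 0.5
```

Swapping in any other prediction rule that doesn’t causally depend on the coin’s outcome leaves the long-run accuracy at 50%, which is the point of the coin-flip objection.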