As the argument goes, you can’t control your past selves; but that isn’t what the experiment asks of you. The only self you’re controlling is the one deciding whether to one-box (equivalently, whether to be a one-boxer).
See, that is the self that past Omega is paying attention to when deciding how much money to put in the box. That’s right: past Omega is watching current you to figure out whether to kill your daughter / put money in the box. It doesn’t matter how he does it; all that matters is whether your current self decides to one-box.
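To make that concrete, here is a minimal sketch of the expected payoffs, using the standard Newcomb numbers ($1,000,000 in the opaque box, $1,000 in the transparent one) and an assumed predictor accuracy of 99%; all of these figures are illustrative, not from the original setup.

```python
# Expected payoffs in Newcomb's problem, assuming Omega predicts your
# *current* decision with some accuracy. All numbers are illustrative.

BIG = 1_000_000    # opaque box, filled iff Omega predicts one-boxing
SMALL = 1_000      # transparent box, always present
ACCURACY = 0.99    # assumed probability Omega's prediction matches your choice

def expected_payoff(one_box: bool, accuracy: float = ACCURACY) -> float:
    """Expected money, given that Omega's prediction tracks the decision
    you actually make (with probability `accuracy`)."""
    if one_box:
        # The opaque box is full iff Omega correctly predicted one-boxing.
        return accuracy * BIG
    # You always get SMALL; the opaque box is full only if Omega erred.
    return SMALL + (1 - accuracy) * BIG

print(expected_payoff(True))   # 990000.0
print(expected_payoff(False))  # 11000.0
```

Under these assumed numbers, one-boxing comes out ahead for any accuracy above roughly 50.05%, so “how he does it” really is irrelevant: any predictor meaningfully better than a coin flip makes one-boxing the better bet.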
Here’s a thought experiment I found enlightening. How does past Omega know whether or not you’re a one-boxer? In any simulation he ran of your brain, the simulated you could simply notice it’s a simulation, and then Omega wouldn’t get the correct result, right? But, as we know, he does get the result right almost all of the time. So the simulation must be indistinguishable from reality: the simulated you looks outside and sees a bird on a tree; if it uses the bathroom, the toilet might clog. Any giveaway would let the selfish part of you two-box inside the simulation while still one-boxing in real life.
The point? How do you know that current you isn’t the simulation past Omega is using to decide whether to kill your daughter? Are philosophical claims about the irreducibility of intentionality really enough to justify taking that risk?
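You can even put a number on the risk, under one hedged assumption: the sim and the real you run the same brain, so they make the same choice. Here is a sketch where p is your credence that this very moment is the simulation; the payoffs and names are illustrative, not from the original post.

```python
# Why "which one am I?" doesn't matter, under the assumption that the
# sim and the real you run the same brain and so choose alike.
# p is your credence that *this* moment is the simulation.
# Numbers and names are illustrative.

BIG, SMALL = 1_000_000, 1_000  # standard Newcomb payoffs (assumed)

def real_world_payoff(choice: str, p: float) -> float:
    """Expected real-world money given your current choice.

    If you're the sim (prob p), your choice sets the box contents, and
    the real you, being the same algorithm, repeats the choice. If
    you're real (prob 1 - p), the sim already made this same choice.
    Either way, the box tracks what you decide right now."""
    if choice == "one-box":
        return p * BIG + (1 - p) * BIG      # box full in both branches
    return p * SMALL + (1 - p) * SMALL      # box empty in both branches

# The credence p cancels out entirely:
assert real_world_payoff("one-box", 0.3) == real_world_payoff("one-box", 0.9)
```

Notice that p drops out: you don’t need to know whether you’re the sim, only that the sim and the original decide alike, which is exactly what Omega’s accuracy implies.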