The machine works only by getting you to think certain things, and those thoughts cause your decision. So you decide in the same way you normally do.
I did not say I would one-box if I were programmed to two-box; I said I would one-box.
And if you were programmed to two-box, and unaware of that fact?
Your response is like responding to “What would you do if there were a 50% chance of you dying tomorrow?” with “I’d survive.”
It completely ignores the point of the situation, and assumes godlike agency.