No, you don’t, you don’t get to decide. The decision has been made.
You’re ignoring the fact that, normally, the thoughts going on in your brain are PART of how the decision is determined by the laws of physics. In your scenario, they’re irrelevant. Whatever you think, your action is determined by the machine.
EDIT: http://lesswrong.com/lw/2mc/the_smoking_lesion_a_problem_for_evidential/2hx7?c=1 You’ve claimed that you would one-box in this scenario. You’ve claimed, therefore, that you would one-box even if programmed to two-box.
I.e. you’ve claimed you’re capable of a logically impossible act. Either that, or you don’t understand your own scenario.
The machine works only by getting you to think certain things, and those thoughts cause your decision. So you decide in the same way you normally do.
I did not say I would one-box if I were programmed to two-box; I said I would one-box.
And if you were programmed to two-box, and unaware of that fact?
Your response is like answering “What would you do if there were a 50% chance of you dying tomorrow?” with “I’d survive.”
It completely ignores the point of the scenario and assumes godlike agency.