I flip a coin; if it’s heads, I give you a million dollars, else I give you a thousand dollars. How much money should you get from me? (And is this problem any different from the last one?)
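(Presumably the intended comparison is expected value: before the flip, the gamble is worth 0.5 × $1,000,000 + 0.5 × $1,000 = $500,500 in expectation, even though once the coin lands you only ever receive one of the two amounts.)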
At some point, these questions no longer help us make rational decisions. Even an AI with complete access to its source code can’t do anything to prepare itself for these situations.
No, you don’t; you don’t get to decide. The decision has been made.
You’re ignoring the fact that, normally, the thoughts going on in your brain are PART of how the decision is determined by the laws of physics. In your scenario, they’re irrelevant. Whatever you think, your action is determined by the machine.
The obvious answer is ‘whatever Omega decided’. But I hope that I one-box.
You might as well say in general that you do “whatever the laws of physics determine.”
But you still have to decide anyway. Hoping doesn’t help.
EDIT: http://lesswrong.com/lw/2mc/the_smoking_lesion_a_problem_for_evidential/2hx7?c=1 You’ve claimed that you would one-box in this scenario. You’ve claimed, therefore, that you would one-box even if programmed to two-box.
I.e., you’ve claimed you’re capable of logically impossible acts. Either that, or you don’t understand your own scenario.
The machine works only by getting you to think certain things, and these things cause your decision. So you decide in the same way you normally do.
I did not say I would one-box if I were programmed to two-box; I said I would one-box.
And if you were programmed to two-box, and unaware of that fact?
Your response is like responding to “what would you do if there were a 50% chance of you dying tomorrow?” with “I’d survive.”
It completely ignores the point of the situation, and assumes godlike agency.