Actually, if a real-world analog to Newcomb’s Problem ever came up in my life, there’s a not-insignificant chance that I would turn down the $1000 in the transparent box as well and just walk away—that is, that I would zero-box—under the general principle that if I don’t trust the motives of the person setting up the game, I do better not to take any of the choices they are encouraging me to take, no matter how obvious those choices may seem. Maybe I’ve wandered into the next Batman movie, or the box is poisoned, or something.
Of course, if you insist on rejecting the setup to Newcomb’s Problem rather than cooperating with it, you’ll never get to see whether there’s anything valuable being set up.
I think it’s inherent in the problem that you fully understand what is going on and know you aren’t part of some weird trick.
It’s not realistic, but being realistic isn’t the point of the problem.