I’d one box because there’s no way I’d risk losing a million dollars to get an extra thousand based on arguments about a problem which bores me so much I have trouble paying attention to it.
What if Box B contains $1,500 instead of $1,000,000 but Omega has still been right 999 times out of 1000?
You did get me to pay a little more attention to the problem. I’d two box in that case. I’m not sure where my crossover is.
Edited to add: I think I got it backwards. I’d still one box. Committing to one-box seems advantageous if Omega is reasonably reliable.
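For what it’s worth, here’s a rough sketch of where that crossover sits under a purely evidential reading. It assumes (hypothetically) that the 999-out-of-1000 track record is a fair estimate of the probability p that Omega calls your actual choice, with the usual $1,000 in the transparent box:

```python
# Expected value of each strategy, assuming the predictor is right
# with probability p about whatever you actually choose.

def one_box_ev(p, box_b):
    # You get Box B's contents only if the predictor foresaw one-boxing.
    return p * box_b

def two_box_ev(p, box_b, box_a=1_000):
    # You always get Box A; Box B is full only if the predictor erred.
    return box_a + (1 - p) * box_b

def crossover(p, box_a=1_000):
    # One-boxing wins when p*B > box_a + (1-p)*B, i.e. B > box_a / (2p - 1).
    return box_a / (2 * p - 1)

p = 999 / 1000
print(one_box_ev(p, 1_500))   # ~1498.5
print(two_box_ev(p, 1_500))   # ~1001.5
print(crossover(p))           # ~1002.0
```

On those numbers the crossover is only about $1,002, so even a $1,500 Box B favors one-boxing, which matches the edit above.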
I suppose that then you could put numbers on whether the person will reliably keep commitments.
Best analysis of Newcomb’s Paradox I’ve seen so far—boring. There’s nothing to see here. It all comes down to how you model the situation and what your priors are.
I find it hard to imagine a situation where I put more belief in the Predictor’s actual ability than in its ability to plant false evidence whose trick I can’t figure out.
I’d two box because I see no reason to risk losing anything. In the face of perceived trickery, I’m all the more betting on causality.