Well, here’s the paradox: strict one-boxers in transparent Newcomb argue that they must one-box always, even when the box is empty, and therefore the boxes will be full.
Not just that, they argue that they must one-box always, even when the box is empty, BECAUSE then the box will be full.
Is that actually commitment, or is that just doublethink, the ability to hold two contradictory ideas at the same time? How can you commit to taking a course of action (grabbing an empty box) in order to make that course of action (grabbing an empty box) impossible?
And yeah, I’m sure I’d lose at playing transparent Newcomb, but I’m not sure that anyone but a master of doublethink could win it.
If I know that I’m going to play transparent Newcomb, and the only way to win at transparent Newcomb is to become a master of doublethink, then I want to become a master of doublethink.
No, what the strict one-boxers actually argue is that they must one-box always, even when they think they see that the box is empty.
The argument is that you can’t do the Bayesian update P(the box is empty | I see the box as empty) = 1, because Bayesian updating in general fails to “win” when there are other copies of you in the same world, or when others can do source-level predictions of you. Instead, you should use Updateless Decision Theory (UDT).
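Whatever one makes of the doublethink worry, the expected-value arithmetic behind this position can be made concrete. Here is a minimal sketch under assumed numbers that are not part of the discussion above: $1,000 in the small box, $1,000,000 in box B, Omega fills B exactly when it predicts the agent’s policy is to one-box no matter what it sees, and that prediction of the policy is right 99% of the time.

```python
# Toy expected-value comparison for the transparent Newcomb setup described above.
# Assumed, not taken from the discussion: box A holds $1,000, box B holds $1,000,000
# or nothing, Omega fills B iff it predicts the agent will one-box no matter what it
# sees, and the prediction of the policy is correct with probability ACCURACY.

ACCURACY = 0.99
SMALL, BIG = 1_000, 1_000_000

def expected_value(policy):
    """policy maps what the agent sees ('full' or 'empty') to 'one-box' or 'two-box'."""
    unconditional_one_boxer = (policy('full') == 'one-box'
                               and policy('empty') == 'one-box')
    ev = 0.0
    for omega_correct, prob in ((True, ACCURACY), (False, 1 - ACCURACY)):
        # Omega fills B iff it classifies the agent as an unconditional one-boxer;
        # when it errs, it gets that classification backwards.
        fills_b = unconditional_one_boxer if omega_correct else not unconditional_one_boxer
        seen = 'full' if fills_b else 'empty'
        b_contents = BIG if fills_b else 0
        payoff = b_contents if policy(seen) == 'one-box' else b_contents + SMALL
        ev += prob * payoff
    return ev

policies = {
    'always one-box':             lambda seen: 'one-box',
    'two-box when B looks empty': lambda seen: 'one-box' if seen == 'full' else 'two-box',
    'always two-box':             lambda seen: 'two-box',
}
for name, policy in policies.items():
    print(f'{name:30s} EV = ${expected_value(policy):>12,.0f}')
```

Under those assumptions the unconditional one-boxing policy comes out far ahead of the policy that “updates” on seeing an empty box (roughly $990,000 versus $11,000), which is the sense in which refusing the update is supposed to win.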
BTW, I don’t think UDT is applicable to most human decisions (or rather, it probably tells you to do the same things as standard decision theory), including things like voting, contributing to charity, or deciding whether to have children, because I think logical correlations between ordinary humans are probably pretty low. (That’s just an intuition, though, since I don’t know how to do the calculations.)
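To make the “calculations” concrete in at least a toy form, here is a sketch of the kind of estimate the logical-correlation framing suggests for voting. Every number in it (electorate size, stakes, how many other people’s decisions are assumed to swing with yours) is a hypothetical placeholder, not a claim from this comment.

```python
# Toy model: the value of a vote under a logical-correlation (UDT-style) framing.
# All numbers below are hypothetical placeholders.

from math import pi, sqrt

def p_single_vote_decisive(n_voters: int) -> float:
    """Rough chance one vote flips a 50/50 two-way race (~sqrt(2 / (pi * n)))."""
    return sqrt(2 / (pi * n_voters))

def ev_of_voting(n_voters: int, stakes: float, correlated_deciders: int) -> float:
    """Expected value of deciding to vote, if `correlated_deciders` other people's
    decisions are assumed to swing together with yours."""
    votes_moved = 1 + correlated_deciders
    # For blocks much smaller than the electorate, the chance the block is
    # decisive is roughly votes_moved times the single-vote chance.
    return votes_moved * p_single_vote_decisive(n_voters) * stakes

N, STAKES = 10_000_000, 1e9   # electorate size, assumed value of the better outcome
for correlated in (0, 10, 10_000):
    ev = ev_of_voting(N, STAKES, correlated)
    print(f'{correlated:>6} correlated deciders: EV ~ ${ev:,.0f}')
```

The only place the UDT-flavored reasoning enters is the correlated_deciders term; if that is close to zero, the answer collapses back to the standard single-vote calculation, which is the sense in which low logical correlation would make UDT and standard decision theory agree.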
If we can’t trust our senses more than Omega’s predictive powers, then the “transparent” boxes are effectively opaque, and the problem becomes essentially normal Newcomb.
As for the correlations, though: ordinary correlations between ordinary humans seem to be pretty high. Do they suffice for our needs? I’m not sure...