Eliezer: If all I want is money, then I will one-box on Newcomb’s Problem.
Mmm. Newcomb’s Problem features the rather weird case where the relevant agent can predict your behaviour with 100% accuracy. I’m not sure what lessons can be learned from it for the more normal cases where this isn’t true.
Paul, that’s a good point.
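[A quick expected-value sketch, not part of the original exchange, to make Paul's point concrete. It assumes the standard Newcomb payoffs of $1,000,000 in the opaque box and $1,000 in the transparent box, and a predictor that is right only with probability p rather than with certainty.]

```python
# Rough sketch (assumed standard Newcomb payoffs): evidential expected value
# of one-boxing vs. two-boxing when the predictor is right with probability p.

BIG = 1_000_000   # opaque box: filled iff you were predicted to one-box
SMALL = 1_000     # transparent box: always contains $1,000

def one_box_ev(p: float) -> float:
    # The opaque box is full exactly when the (one-box) prediction was correct.
    return p * BIG

def two_box_ev(p: float) -> float:
    # The opaque box is full only when the predictor wrongly expected one-boxing.
    return (1 - p) * BIG + SMALL

for p in (1.0, 0.99, 0.9, 0.51, 0.5):
    print(f"p={p:>5}: one-box EV = {one_box_ev(p):>11,.0f}, "
          f"two-box EV = {two_box_ev(p):>11,.0f}")

# On these numbers, one-boxing has the higher evidential EV whenever
# p > 0.5005, so the comparison does not hinge on 100% predictive accuracy,
# though the case for reading much into imperfect predictors is a separate question.
```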