Yes, it’s a Newcomb-like problem. Anything where one agent predicts another is. People predict other people, with varying degrees of success, in the real world. Ignoring that when looking at decision theories seems silly to me.