Imagine that Omega tells you that it tossed its coin a million years ago, and would have turned the sky green had it landed the other way. Back in 2010, I wrote a post arguing that in this sort of situation, since you (and every other human being) have always seen the sky being blue, everyone has always had enough information to conclude that there is no benefit from paying up in this particular counterfactual mugging. So there has never been any incentive to self-modify into an agent that would pay up … and hence you shouldn't.
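The 2010 argument boils down to a simple expected-value comparison. A minimal sketch, using hypothetical stakes (the 100 / 10,000 figures are illustrative, not from the original post): once your prior already includes the observation that the sky is blue, the green-sky branch, where Omega would have rewarded a paying agent, carries probability zero.

```python
COST = 100        # hypothetical amount Omega asks you to pay when mugged
REWARD = 10_000   # hypothetical reward in the counterfactual green-sky branch

def expected_value(pays: bool, p_green: float) -> float:
    """Expected value of a policy, given your credence that the sky is green."""
    return p_green * (REWARD if pays else 0) + (1 - p_green) * (-COST if pays else 0)

# Having always seen a blue sky, p_green = 0: paying strictly loses.
print(expected_value(pays=True, p_green=0.0))   # -100.0
print(expected_value(pays=False, p_green=0.0))  # 0.0

# Before the coin toss (p_green = 0.5), a paying policy would have been
# worth precommitting to, as in the standard counterfactual mugging.
print(expected_value(pays=True, p_green=0.5))   # 4950.0
```

With the observation already baked into the prior, refusing dominates, which is what the 2010 post concluded.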
I think this sort of reasoning doesn't work if you also have a precommitment regarding logical facts. In that case you know the sky is blue, but you don't know what that implies. When Omega informs you of the logical connection between sky color, your actions, and your payoff, you won't update on this logical fact: it is one implication away from the logical prior you precommitted yourself to. And the best policy given this prior, which contains information about sky color but not about this blackmail, is not to pay. A priori, not paying just changes the situation in which you will be blackmailed (hence, what blue sky color means), but not the probability of a positive intelligence explosion in the first place. Knowing or not knowing the color of the sky doesn't make a difference as long as we don't know what it implies.
(HT Lauro Langosco for pointing this out to me.)