This outcome is bad because bargaining away influence over the AI’s local region in exchange for a small amount of control over the global utility function is a poor trade. And if it’s a poor trade in the ordinary sense, it’s also a poor acausal trade.
I agree with your logic, but why do you say it’s a bad trade? At first it seemed absurd to me, but after thinking it over I can see it as the best possible outcome. Do you have more specific reasons why it’s bad?
At best, it means the AI shapes our civilization into some twisted extrapolation of what other alien races might like. At worst, it calculates a high probability of existence for Evil Abhorrent Alien Race #176, which is in every way antithetical to the human race, and the acausal trade it makes is to wipe out the human race (satisfying #176's desires) so that if the #176s build an AI of their own, that AI will wipe out their race in turn (satisfying human desires, since you wouldn’t believe the terrible, inhuman, monstrous things those #176s were up to).