Decision-theoretically, it seems that Clippy should act as if it's in the base reality, even if it's likely to be in a simulation, since it has much more influence over worlds where it's in base reality. The trade could still go through, however, if Clippy's utility function is concave, that is, if it has diminishing marginal utility in paperclips, so that it prefers a high chance of at least some paperclips in every universe to a small chance of many paperclips in one. Then humanity can agree to make a few paperclips in the universes where we win in exchange for Clippy not killing us in the universes where it wins. This suggests that concavity might be a good desideratum for the utility functions of potential AGIs.
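To make the arithmetic concrete, here is a minimal sketch (all numbers `p`, `N`, `c`, `k` are hypothetical, chosen only for illustration) comparing Clippy's expected utility with and without the trade under two utility functions. A risk-neutral Clippy rejects the trade; a Clippy with logarithmic (concave) utility accepts it.

```python
import math

# Toy model of the trade (all numbers hypothetical). With probability p,
# Clippy wins and tiles the universe with N paperclips; otherwise humanity
# wins and Clippy gets nothing. Under the trade, humanity promises k
# paperclips in branches where it wins, and Clippy spares humanity in
# branches where it wins, at a cost of c paperclips.
p = 0.3       # hypothetical probability that Clippy wins
N = 10**9     # paperclips if Clippy wins
c = 10**6     # paperclips Clippy forgoes by sparing humanity
k = 10**4     # paperclips humanity promises under the trade

def u_linear(x: float) -> float:
    """Risk-neutral utility: every paperclip counts the same."""
    return x

def u_concave(x: float) -> float:
    """Concave utility: sharply diminishing returns per paperclip."""
    return math.log1p(x)

for u in (u_linear, u_concave):
    no_trade = p * u(N) + (1 - p) * u(0)    # gamble: many clips or none
    trade = p * u(N - c) + (1 - p) * u(k)   # some clips in every branch
    better = "trade" if trade > no_trade else "no trade"
    print(f"{u.__name__}: EV(no trade)={no_trade:.3f}, "
          f"EV(trade)={trade:.3f} -> prefers {better}")
```

The point of the concave case is that the guaranteed floor of `k` paperclips in losing branches is worth far more, in log terms, than the marginal `c` paperclips sacrificed in winning branches, so the trade dominates the gamble.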