But if you’re going to rely on the tiny charity of aliens to construct hopeful-feeling scenarios, why not rely on the charity of aliens who anthropically simulate us to recover our mind-states…
This makes sense if identity-as-physical-continuity isn’t part of our (or the aliens’) values. But if it were, then the aliens would potentially have motivation to trade with the paperclip-maximizers to ensure our physical survival, not just rescue our mind-states.
Another thing worth mentioning here: these nice charitable aliens might not be the only ones in the multiverse trying to influence what happens to our bodies/minds. If there are other aliens whose morality is scary, who knows what they might want to do with, or have done to, our bodies/minds.
I do think relying on aliens simulating us is reasonable, assuming the Death With Dignity/MIRI view of AI is correct, i.e. that with high probability we can't align an AGI at all.