This is overly complex. Now we're also assuming the AI goes wrong? These people want to be in a simulation, and they need a Schelling point with other humanities. Why wouldn't they simply give the AI clear instructions to simulate other Earths?