This seems silly to me. It is true that in a single instance, a quantum coin flip probably can't save you if classical physics has decided that you're going to die. But the exponential butterfly effect of all the minuscule changes that accumulate across splits between now and the arrival of AGI should add up to a huge spread of possible universes. In some of those, the AI will be deadly; in others, the AI's seed will happen to be picked just right for it to turn out good, or the exact right method for successful alignment will be the first one discovered.