It might interest you to know that Eliezer considers MWI to be obviously true:
We have embarrassed our Earth long enough by failing to see the obvious. So for the honor of my Earth, I write as if the existence of many-worlds were an established fact, because it is. The only question now is how long it will take for the people of this world to update.

Source. More on MWI by Eliezer.
The reason he is pessimistic about humanity's survival even though he believes in MWI is that MWI's being true does not save us.
Although it is possible to set up a special situation (e.g., by connecting a quantum-measurement device to a bomb) in which you die in one branch but live in another, most situations aren't like that: most situations have you surviving in both branches or dying in both.
This seems silly to me. It is true that in a single instance, a quantum coin flip probably can't save you if classical physics has already decided that you're going to die. But the exponential butterfly effect from all the minuscule changes that accumulate across splits between now and then should add up to a huge spread of possible universes by the time AGI arrives. In some of these, the AI will be deadly; in others, the seed of the AI will be picked just right for it to turn out good, or the exact right method for successful alignment will be the first one discovered.
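To make the "huge spread" claim concrete, here is a toy simulation of the compounding-divergence intuition. It is purely illustrative and not physics: the logistic map stands in for any dynamics that are sensitive to initial conditions, the 1e-9 per-step nudges stand in for tiny branch-to-branch differences, and the step count and constants are arbitrary choices I've made for the sketch.

```python
import random

# Toy illustration (not actual physics): tiny per-step differences between
# "branches", fed through a chaotic map, produce a wide spread of final states.

def run_branch(x0: float, steps: int, epsilon: float, rng: random.Random) -> float:
    """Iterate the logistic map in its chaotic regime, nudging the state slightly each step."""
    x = x0
    for _ in range(steps):
        x = 3.99 * x * (1.0 - x)  # chaotic logistic map
        # tiny per-split difference, clamped back into [0, 1]
        x = min(max(x + rng.uniform(-epsilon, epsilon), 0.0), 1.0)
    return x

rng = random.Random(0)
finals = [run_branch(x0=0.5, steps=200, epsilon=1e-9, rng=rng) for _ in range(10_000)]

# Despite identical starting points and nudges of only ~1e-9 per step,
# the final states spread across essentially the whole [0, 1] interval.
print(f"min={min(finals):.3f}  max={max(finals):.3f}  spread={max(finals) - min(finals):.3f}")
```

Running this should print a minimum near 0 and a maximum near 1: the runs end up essentially everywhere. That is the shape of the argument above; the spread comes from compounding sensitivity over many splits, not from any single lucky coin flip.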