I would add that the most uncertain thing is the interaction between different x-risks, as it seems that most of them could occur within a very short period of time, perhaps 10–20 years, just before the creation of powerful AI. I call this epoch "oscillations before the Singularity," and for me the main question is whether we will be able to survive it.