I like the idea of space colonization, but it’s not clear that it’s a practical, let alone robust, way to get our eggs into more baskets.
I read somewhere that to calibrate the logistics of getting everyone off Earth, you should consider how much it would cost and how long it would take to load every human onto a passenger jet and fly them all to the same continent. I wish I could find that essay. Long story short, it would take a loooot of resources. So, it probably won’t be our eggs in particular getting into more baskets, but at least the eggs of some fellow humans.
On existential risk overall, my reading on AI has been pushing me towards the view that global warming → civilizational collapse may actually be our best hope for the future, if only it happens fast enough to prevent the development of a superintelligence.
I see two main outcomes: either there are enough exploitable resources left to rebuild a technological civilization, in which case someone will get back to pursuing superintelligence, or there are not enough exploitable resources left to rebuild a technological civilization, in which case we piss away our last days throwing spears and dying of dysentery. Or, third option, maybe we evolve into non-tool-using creatures like in Galapagos. In any case, the left-hand side of the Drake Equation remains at zero. Breaking out of the overshoot/collapse cycle means the risk of going out with a bang, but the alternative is the certainty of going out with a whimper.
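For context, the Drake Equation estimates N, the number of civilizations in the galaxy capable of communicating across interstellar distances; "the left-hand side remains at zero" means N stays zero (as far as our descendants are concerned) in every one of these scenarios:

```latex
N = R_* \cdot f_p \cdot n_e \cdot f_l \cdot f_i \cdot f_c \cdot L
```

where $R_*$ is the rate of star formation, $f_p$ the fraction of stars with planets, $n_e$ the number of habitable planets per such star, $f_l$, $f_i$, $f_c$ the fractions developing life, intelligence, and detectable technology, and $L$ the lifetime over which such a civilization remains detectable.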
As far as x-risk is concerned, we all have the same eggs.