I think they are just using that as an example of a strongly opinionated sub-agent, one of many possible sub-agents, each with its own highly specific probability assessment of doom.
As for “survival is the default assumption”: on the surface, declaring that implies the chance of survival is overwhelming except in a cataclysmic AI scenario. To put it another way: we have, say, a 99% chance of survival so long as we get AGI right.
To put it yet another way: Hollywood has made popular films about the human world being destroyed by nuclear war, climate change, viral pandemic, and asteroid impact, to name a few. Different sub-agents could assign higher or lower probabilities to each of those scenarios depending on factors like domain knowledge, and taken together they raise the question of why we presume that survival is the default. What is the ensemble average of doom?
Is doom more or less likely than survival over any given time frame?
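To make the “ensemble average of doom” idea concrete, here is a minimal sketch. All the per-scenario numbers are made up purely for illustration; the point is the mechanics of combining sub-agent estimates and asking about survival over a time frame, not the specific values.

```python
# Hypothetical annual doom probabilities from opinionated sub-agents.
# These numbers are illustrative placeholders, not real risk estimates.
sub_agents = {
    "nuclear_war": 0.005,
    "climate_change": 0.002,
    "viral_pandemic": 0.003,
    "asteroid_impact": 0.0001,
    "misaligned_agi": 0.01,
}

# One simple notion of an ensemble average: the mean of the estimates.
ensemble_average = sum(sub_agents.values()) / len(sub_agents)

# If the scenarios were (roughly) independent, surviving a year means
# avoiding every one of them that year.
p_survive_one_year = 1.0
for p in sub_agents.values():
    p_survive_one_year *= 1.0 - p

def p_survive(years: int) -> float:
    """Survival probability over a time frame, assuming independent years."""
    return p_survive_one_year ** years

print(f"ensemble average of doom: {ensemble_average:.4f}")
print(f"P(survive 1 year):   {p_survive(1):.4f}")
print(f"P(survive 100 years): {p_survive(100):.4f}")
```

Even tiny annual probabilities compound: under these toy numbers, survival over a single year looks near-certain, but over a century it no longer does, which is one way of seeing why “survival is the default” depends heavily on the time frame chosen.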