Explanation of how what we really care about when forecasting timelines is not the point at which the last human is killed, nor the point at which AGI is created, but the point at which it's too late for us to prevent the future from going wrong. Importantly, this point could come before AGI, or even before TAI; it can certainly come well before the world economy is growing at 10%+ per year. (I give some examples of how this might happen.)
Here it is