But I also want to comment on his claim that we have 100 years (with a small probability of extinction).
His claim is that we have 100 years in which we have to be extra careful to prevent x-risk.
The same may be said about most other risks: we could create a new flu virus even now, without any new technologies.
With today’s technology you could create a problematic new virus, but that would hardly mean extinction. Wearing masks 24/7 to filter the air isn’t fun, but it is a possible response if we are afraid of airborne viruses.
Our best options for preventing x-risks are international control systems for dangerous technologies and, later, friendly AI; we need to work on these now, while space colonies have only remote and marginal utility.
It’s not like Hawking doesn’t call for AGI control.