I don’t agree, primarily because that argument only works in isolation. Other existential risks also have more than a 1% probability, so if AI risk were only a 1% probability, we should shift focus to another x-risk.
If you can name another immediate threat with a ≥1% chance of killing everyone, then yes, we should drop everything to focus on that too.
A pandemic that kills even just 50% of the population? <0.1%
An unseen meteor? <0.1%
Climate change? 0% chance that it could kill literally everyone